Data-Driven Business Process Management-Based
All content following this page was uploaded by János Abonyi on 14 December 2021.
Abstract
∗ corresponding author
Email address: [email protected] (Tamás Ruppert*)
need to rethink as well as redesign business processes and models [8], for which
BPR in the context of design science research (DSR) is effective.
”BPR is an organisational initiative for achieving competitive multi-faceted advantages regarding business processes, in terms of cycle time, quality, cost, customer satisfaction and other critical performance metrics” [9]. DSR contributes to business process redesign through iterative process analysis. DSR is a problem-solving paradigm that supports the creation of innovative, knowledge- and data-centric artifacts in the form of models, methods, constructs and instantiations [10]. In an I4.0 manufacturing organization context, artifacts are considered to be I4.0 technologies, tools and methods. The concept of DSR can be carried through the whole BPM process life cycle.
Modelling notations are considered tools of BPR, such as the Business Process Model and Notation (BPMN), Unified Modeling Language (UML), XML Process Definition Language (XPDL), Petri Nets, the Integration Definition Methods (IDEF0 and IDEF3) and the Business Process Execution Language for Web Services (BPEL4WS) [11]. However, in the following, only BPMN is discussed in more detail, as it is designed specifically for business process modeling and its structure is clearer and easier to use [11]. BPMN creates a standardised bridge over the gap between business process design and process implementation [12]. This technique promotes the automated execution of workflows and provides a comprehensive model, the graphical notation of which is a Business Process Diagram (BPD) [12]. An adequate BPMN model is needed to achieve easily applicable data-centric workflow optimisation. The processes defined in BPMN can be further improved and optimised by applying the business process simulation approach [13]. Simulation techniques allow complex data-intensive processes to be analysed and diverse scenarios evaluated [14]. Moreover, the potential of digital twins [15] provides a new level of efficiency in terms of adapting and controlling large, complex business processes [16].
The purpose of this paper is to develop a methodology to improve data-driven process development, resulting in an evidence-based decision support tool in BPM. It aims to facilitate digital transformation and the conversion of production into flexible manufacturing systems, thereby improving business agility.
Organisational I4.0 developments require a clear understanding of the ’as-is’ state [17]. Maturity models highlight strengths and weaknesses [18] and measure the readiness to adopt digital transformation and data-driven solutions. The adoption of new technologies and strategic operational processes requires the redesign of the business processes.
The successful realization of the I4.0 concept demands the application of ICT and automation technologies as well as the related standards and ontologies. Trends in I4.0 define the need for interoperability amongst actors, sensors, and heterogeneous systems [19]. Standards, formal models and ontologies play a significant role in the I4.0 system environment; for their enforcement, linked data should rely on systematically enriched information [20].
I4.0 standards describing reference architectures for smart factories include the ISA-95 [21], ISA-S88 [22] and RAMI4.0 [23] models. The Standards Ontology describes the characteristics of, as well as the relations between, standards and the organizations publishing them, in addition to standardization frameworks in the context of I4.0 [24]. Ontologies can be considered the next generation of standards for connecting information [20]. To develop interoperability, the Industrial Ontologies Foundry (IOF) works with the governmental, industrial and academic sectors as well as standards organizations by creating a set of core and open reference ontologies covering the entire domain of digital manufacturing [25]. Therefore, semantic technologies are essential in the field of ICT and automation technology (AT) to exploit and connect information between and within companies [26]. The development of I4.0-induced business process management based on standardized domains can result in a better-organised and more transparent structure to manage the robustness of the system with a high degree of reliability.
The overall purpose of this paper is to provide a data-driven method for the
analysis and development of Industry 4.0 solutions based on business process
modeling and simulation.
The proposed methodology is based on the improvement of the BPM life cycle with the integration of a data-driven application. The phases of the BPM life cycle are connected with diverse data-based technologies, tools and methods to facilitate its successful adaptation and efficient operation. The potential of data analysis is closely connected to this concept; therefore, this linkage is highlighted by providing an overview of the integrability of the CRISP-DM/ML standards into the BPM life cycle.
This concept is demonstrated through the development project of a Hungarian assembly and engineering company that produces customized and fully autonomous factory units. The illustrative example demonstrates how the redesign of a production line consisting of Computer Numerical Control (CNC) machines can be supported by BPM and BPR tools.
The aforementioned I4.0-induced business process transformation models, methods, and support tools are explored in the following sections.
• Section 2.1 highlights that the BPM life cycle of I4.0 technologies should
intensively utilise data-driven process development tools so that devel-
opment projects can be efficiently supported by the integration of the
standards of data mining (DM) and machine learning (ML) development
projects into the BPM life cycle.
• Some details related to the applicability of the method are also presented
in an illustrative case study in Section 3.
• Finally, the conclusions and future trends in terms of business data and
process management are discussed in Section 4.
2. Business Process Transformation in the age of Industry 4.0
Figure 1: Recommended steps for implementing strategic development [29]
ery).
3. The development of an ’ideal’ vision and agenda for creating the ’to-be’
goal model of an organization (process analysis and redesign).
4. ”Development domains should be prioritized because the creation of In-
dustry 4.0 systems is a complex activity, since the very nature of this
concept implies that it affects all business and technical sub-processes”
[31] (process analysis and redesign).
5. An I4.0 road map can serve as the ’to-be’ model of the redesigned pro-
cesses, which can facilitate the transformation process coordinating the
actions (process redesign).
6. The realization and implementation of the derived I4.0 road map (process
implementation and monitoring).
DSR identifies the problem and its impact, thus revealing potential redesign areas. DSR iterates on how a better artifact (e.g., an I4.0 solution) would perform in the problematic environment, develops and designs a solution, then applies it and measures performance metrics to assess the effectiveness of the redesigned process. Finally, communication of the results is essential for further research [32]. DSR is a research paradigm with a focus on generating hypotheses on how the future could look (’to-be’) and iterating solutions for the redesign, initially filtering out hypotheses that are not worth pursuing. ”By widening the employment of DSR, the information science discipline can acquire a leading position in the field of practice” [33, 34].
The aforementioned concept covers and facilitates BPM as well as the integration of digital technologies, tools and methods for redesigning and implementing I4.0 projects in addition to transformed business processes. The description of the BPM life cycle model, its technology-driven support and the possible data-analytic connections to this concept is given in Section 2.1.
2.1. Development of the Business Process Management (BPM) life cycle using
data-driven applications and standards
This section discusses the foundations of BPM and the technology alignments that support efficient governance for process-oriented organizations, as well as highlighting the benefits of integrating data-driven applications and standards into the BPM life cycle.
Section 2.1.1 defines how I4.0-driven technologies, tools and methods can be utilized through the phases of the BPM life cycle, while Section 2.1.2 provides an insight into the potential integration of the data mining and machine learning standards (CRISP-DM/ML) into the BPM life cycle.
2.1.1. Improvement of the BPM life cycle with the integration of technology alignments
Business processes can be considered combinations of interconnected events, activities and decision points that include many actors and information carriers. Altogether, they contribute to a value-added result for at least one customer [35]. Due to constant digital transformation, managing business processes is no longer a choice but a necessity to remain competitive and productive in the market [36]. However, the role of digital technologies in transforming business processes seems to be under-investigated [37].
BPM is a comprehensive system for managing and transforming organizational operations through business process modeling, execution, and evaluation, with agility and operational performance goals [30]. To monitor and continuously develop business processes, BPM utilizes several methods, policies, metrics, management practices, and software tools [36]. In Figure 2, the model of a BPM life cycle is shown.
Technological developments and digital innovation make it necessary for BPM to be rethought. ”Emerging technologies have capabilities to reshape business process management from its traditional version to a more explorative variant” [38]. Ambidextrous BPM combines two aspects: exploiting the benefits of existing technologies (i.e., exploitative BPM), while simultaneously exploring the benefits of new IT (i.e., explorative BPM) [39]. The innovation- and data-driven perspectives of transforming business models and processes, and the technological tools and methods to support BPM, cannot be neglected. ”Technology is not the goal, but the instrument and process excellence is the driver to introduce new technologies in the operations” [40].
The digitization of manufacturing enterprises creates the basis for I4.0 principles such as interoperability, decentralization, virtualization, real-time capability, service orientation, and modularity [36]. Organizations seek to be stable, flexible and capable of adapting to changes, while organizational stability pro-
Figure 2: A model of the Business Process Management (BPM) life cycle
decision-making through the BPM life cycle.
In Table 1, we determine how I4.0-induced, data-driven applications, tools, and methods can be integrated into the phases of the BPM life cycle, thereby facilitating more efficient evidence-based decision-making. The table includes a short definition of each phase of the BPM life cycle based on Ref. [35]; the supporting tools that enable the operation of the phase to be improved are then discussed, and the references and examples regarding its benefits are given.
Table 1: Industry 4.0-induced applications, tools and methods supporting the phases of the Business Process Management (BPM) life cycle

Process identification
Definition: Processes relevant to the problem being addressed are identified, delimited, and inter-related. The outcome of process identification is a new or updated process architecture, which provides an overall picture of the processes in an organization and their relationships with each other.
Supporting tools: Maturity models, BPMN, Process mining, Cloud computing, Digital twin.
Benefits: Transparency, Virtualization, Real-time capability, Interoperability.
References: [48], [49], [50], [51], [52], [53].

Process discovery
Definition: The current state of each of the relevant processes is documented, typically in the form of one or several ’as-is’ process models.
Supporting tools: Process mining, Cloud computing, Digital twin.
Benefits: Transparency, Virtualization.
References: [51], [52], [53], [54].

Process analysis
Definition: Issues associated with the ’as-is’ process model are identified, documented, and, whenever possible, quantified using performance measures. The output of this phase is a structured collection of issues. These issues are prioritized based on their potential impact and the estimated effort required to resolve them.
Supporting tools: Big data analytics, KPIs, Bottleneck analysis, Cloud computing, AI, Process mining, Process model repair.
Benefits: Transparency, Interoperability, Real-time capability, Interconnectivity.
References: [52], [53], [55], [56], [57], [58], [59], [60].

Process redesign
Definition: Multiple change options are analyzed and compared in terms of the chosen performance measures. Process redesign and process analysis go hand in hand: as new change options are proposed, they are analyzed using process analysis techniques. Eventually, the most promising change options are retained and combined into a redesigned process. The output of this phase is typically a ’to-be’ process model.
Supporting tools: Optimization techniques, Standards, Ontologies and semantic models, Simulation and modeling techniques.
Benefits: Transparency, Interoperability, Interconnectivity, Virtualization.
References: [14], [61], [62], [63], [64], [65], [66], [67], [68], [69], [70], [71].

Process implementation
Definition: The changes required to move from the ’as-is’ process to the ’to-be’ process are prepared and performed. Process implementation covers two aspects: organizational change management and automation. Organizational change management refers to the set of activities required to change the way participants involved in the process work. Process automation refers to the development and deployment of IT systems (or enhanced versions of existing IT systems) that support the ’to-be’ process.
Supporting tools: RPA, IoT, CPS, Cloud computing, Digital twin, Cloud manufacturing, Simulation and modeling techniques, Collaborative/Autonomous robots, AI, AR.
Benefits: Interoperability, Interconnectivity, Real-time capability, Virtualization.
References: [72], [73], [74], [75], [76], [77], [78].

Process monitoring and controlling
Definition: Once the redesigned process is running, relevant data are collected and analyzed to determine how well the process is performing with respect to its performance measures and performance objectives. Bottlenecks, recurrent errors, or deviations with regard to the intended behaviour are identified and corrective actions are undertaken. New issues may then arise, in the same or in other processes, which requires the cycle to be repeated on a continuous basis.
Supporting tools: IoT, CPS, Cloud computing, MES, Simulation and modeling techniques, AI, AR, Process mining, Digital twin.
Benefits: Interoperability, Interconnectivity, Real-time capability, Virtualization.
References: [53], [55], [77], [78], [79], [80].
In the following, the phases of the BPM life cycle are explored with a signif-
icant focus on supporting digital technologies, tools and methods.
The process identification phase can be supported by organizational maturity models, digital technologies and tools such as process mining, the Business Process Model and Notation (BPMN) and the Digital Twin of an Organization (DTO). Conceptualizing and measuring the maturity of an organization or a process facilitates the understanding of business architectures and therefore allows the ’as-is’ state of processes to be detected [48]. BPMN 2.0 is a standard notation for modeling business processes [81]. The flat control flow perspective, sub-processes, data flows and resources can be integrated within one BPMN diagram, which is interesting from the process miners’ point of view [49]. BPMN serves as a representational standard for modeling processes and opens excellent perspectives for the applicability of process mining. ”Discovered process models become available and understandable, and the models can be imported/exported from/to any BPMN modeling tool and executed, process mining techniques can be easily integrated to the existing suites (BPMN serves as an interface in this case). Moreover, BPMN models allow for the combination of different perspectives varying from control flow (including sub-processes) to the perspective of resources” [49]. Process mining is an innovative tool supporting the digital transformation of companies by providing a holistic insight into actual processes and complexities, therefore allowing the identification of inefficiencies and effort drivers [50]. Process mining techniques should support workflow models such as BPMN [53]. Models extracted from event data provide a purposeful abstraction of reality. Therefore, due to the dynamic nature of processes, process mining should be considered as a continuous process to create a ”living and learning process model” [53]. The next-level realization of current processes is utilizing digital twins. The Digital Twin of an Organization (DTO) serves as a virtual copy of processes and therefore promotes the understanding of process architectures [52]. A DTO is “a dynamic software model of any organization that relies on operational and/or other data to understand how an organization operationalizes its business model, connects with its current state, responds to changes, deploys resources and delivers expected customer value” [51].
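To illustrate the core idea behind mining process structure from event data (a sketch of the general technique, not the specific tooling used here), the following Python snippet derives a directly-follows graph from a hypothetical event log; all activity names are invented for illustration.

```python
from collections import Counter

def discover_dfg(event_log):
    """Build a directly-follows graph (DFG) from an event log.

    event_log: list of traces; each trace is the ordered list of
    activity names recorded for one case (e.g. one production order).
    Returns a Counter mapping (activity_a, activity_b) -> frequency.
    """
    dfg = Counter()
    for trace in event_log:
        # Count each pair of consecutive activities within a case.
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Hypothetical event log extracted from a workflow system.
log = [
    ["Receive order", "Check stock", "Machine part", "Ship"],
    ["Receive order", "Check stock", "Reorder material", "Machine part", "Ship"],
    ["Receive order", "Check stock", "Machine part", "Ship"],
]

dfg = discover_dfg(log)
# The most frequent arcs approximate the 'as-is' control flow.
for (a, b), n in sorted(dfg.items(), key=lambda kv: -kv[1]):
    print(f"{a} -> {b}: {n}")
```

The arc frequencies give a first, data-grounded picture of the ’as-is’ flow, including the exceptional "Reorder material" path that a hand-drawn model might omit.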
Process discovery can be facilitated by process mining and the perspective of a DTO. Process mining enables businesses to analyze business processes based on digital traces captured in event logs. It is an effective tool to visualize the ’as-is’ process flows and check their actual conformance [52]. A DTO provides a virtual interpretation of actual process flows [51], therefore allowing a precise ’as-is’ model of processes to be created. Furthermore, the Business Process Catalog and Classification Scheme (BPCCS) can foster the discovery of business process elements by providing a shared vocabulary of various business process models (e.g. activities, sub-processes, roles) [82]. ”Combining Robotic Process Automation (RPA) with the popular BPM approach poses a useful strategy as the ’as-is’ process is optimized first”. Subsequently, the RPA procedure is then deployed on the optimized process to reach its full automation potential [54].
The phase of process analysis can be improved by process mining, big data analytics, KPIs, artificial intelligence (AI) or diverse analysis techniques such as bottleneck analysis. Big data management and analytics cannot be neglected in terms of process analysis. Big data analytics allows business processes to be measured and knowledge to be directly translated into decision-making and performance [83]. Big data enables internal processes to be streamlined by reducing bottlenecks, therefore increasing the efficiency of processes [59]. Identifying bottlenecks is essential as they affect the operation of the whole system. The bottleneck of a manufacturing system refers to a subsystem that limits the capacity of the entire system. Overall Equipment Effectiveness (OEE) can be used as a KPI that indicates the overall capacity of a system and its subsystems. However, the lowest OEE may not identify the bottleneck, because the losses of a subsystem can be caused by other subsystems. To overcome this drawback, the standalone OEE (OEEsa) was introduced, which reveals the true capability of a subsystem that is partially hidden due to external factors [58]. KPIs measure performance in terms of business goals, typically reflecting the time, cost and quality domains. A direct link exists between KPIs, objectives, and strategies [47]. In terms of evaluating ’as-is’ processes, registered KPI values can be verified by process mining to determine deviations from KPI target values, therefore facilitating the redesign phase [84].
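The difference between OEE and the standalone OEE can be illustrated numerically. The sketch below uses the common three-factor OEE formula together with a simplified reading of the standalone idea (externally caused downtime is excluded from the availability term); it is not the exact formulation of Ref. [58], and the figures are invented.

```python
def oee(availability, performance, quality):
    """Classic Overall Equipment Effectiveness: the product of the three factors."""
    return availability * performance * quality

def standalone_oee(planned_time, internal_loss, external_loss, performance, quality):
    """Simplified standalone OEE: downtime imposed by other subsystems
    (blockage/starvation, i.e. external_loss) is removed from the time base,
    so only the subsystem's own losses reduce its availability."""
    availability_sa = (planned_time - internal_loss - external_loss) \
        / (planned_time - external_loss)
    return oee(availability_sa, performance, quality)

# Hypothetical 480-minute shift on one CNC machine:
# 48 min of its own breakdowns, 72 min waiting on upstream stations.
conventional = oee((480 - 48 - 72) / 480, 0.9, 0.98)  # charges all losses
standalone = standalone_oee(480, 48, 72, 0.9, 0.98)   # charges own losses only
print(round(conventional, 3), round(standalone, 3))
```

The conventional figure understates the machine's capability because it charges the machine for starvation caused elsewhere; the standalone figure is higher, pointing the bottleneck search towards the upstream subsystem instead.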
Process mining enables structured data to be analyzed from a process perspective. In some cases, it allows KPIs to be modified in ways that optimize the output to facilitate data-based root cause analysis, which enables errors to be directly connected to relevant process steps as well as identifying unwanted effects [55]. Process mining techniques enable observed behaviour (e.g. event logs) to be compared with modeled behaviour (e.g. a BPMN model), and therefore to detect and diagnose differences between observed and modeled behaviour [60]. Process Model Repair uses a reference process model and an event log as input. If the model cannot fully explain the observed behaviour, those parts of the model which are not consistent with the observations can be repaired using this technique [60]. Furthermore, constantly improving technology fosters the continuous development of business processes; thus, business process redesign can be considered as a process enhancement operation (process model repair). Process mining can be further aided by AI, which offers possibilities in predictive process mining [55]. Predictive process mining plays a significant role in business scenarios as it enables the next activity of a running process to be identified in advance. Therefore, it can foster ”optimal management of resources and promptly trigger remedial operations to be carried out” [57]. The data gathered about the execution of processes provides information about decision patterns, performance, and control flow, which can be utilized in simulation models to form a new complex process model [53]. The comprehensive twin of all process activities and complexities provides the strategic management of an organization with insights based on facts. Strategic directions such as digital transformation programs become measurable and thus easier to manage [52].
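The observed-versus-modeled comparison behind conformance checking and model repair can be sketched as follows. This naive version only compares directly-follows relations, whereas real techniques work on full process models (e.g. BPMN with concurrency); the model and log below are invented.

```python
def conformance_report(modeled_pairs, event_log):
    """Naive conformance check: return observed directly-follows pairs
    that the reference model does not allow. These deviations are the
    candidate spots for diagnosis or process model repair."""
    observed = set()
    for trace in event_log:
        observed.update(zip(trace, trace[1:]))
    return observed - modeled_pairs

# Reference model allows only A -> B -> C.
model = {("A", "B"), ("B", "C")}
log = [["A", "B", "C"], ["A", "C"]]  # the second trace skips B
print(conformance_report(model, log))  # {('A', 'C')}
```

A repair step would then either add the A-to-C shortcut to the model (if skipping B is legitimate) or flag the deviating cases for correction.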
The process redesign phase of the BPM life cycle can be improved by I4.0-induced technologies, models and tools, in addition to optimization techniques, standards, ontologies and semantic models, as well as simulation and modeling techniques. A major aspect of I4.0 is to provide a coherent approach for adopting semantic communication between intelligent systems. To redesign businesses according to this approach, a communication model is needed, for which ontologies can provide a solution by formalizing the knowledge of smart manufacturing in an interoperable way. ”An unambiguous, semantic-based knowledge representation of concepts for smart manufacturing domain is required to ensure a coherent and effective human-robot collaboration” [62]. Industry practitioners can use these ontologies to conceptualize implementation scenarios [63]. Indeed, as every scenario considered within the framework of I4.0 includes different entities which communicate and cooperate, the main role of the presented ontological standard is to facilitate that exchange [62]. According to the domains of I4.0, Factory 4.0 or smart manufacturing, both the business and production perspectives are involved. ”As I4.0 relies heavily on robotic agents which have to evolve and perform the main operations in smart manufacturing environment and which are solicited to communicate with human operators, customers, or with diverse distributed partners. The standardization of knowledge representation is a key element facing I4.0 development and is required to be addressed quickly and efficiently to avoid accumulated difficulties at later stages of the development” [62]. Therefore, ontological standardization regarding I4.0 mainly relies on IEEE 1872-2015 - Standard Ontologies for Robotics and Automation [64], which can be extended by specific ontologies such as the Core Ontology for Robotics and Automation (CORA), the Ontology for Autonomous Robotics (ROA), the Ontology for Robotic Architecture (ORArch) or the Ontology for Industry 4.0 (O4I4). During process redesign, the architecture of the ’to-be’ process model is developed, which in the I4.0 environment cannot neglect the connection between hardware and software. This viewpoint can be supported by ORArch, which elaborates on how hardware and software can be represented together in mixed architecture descriptions [62], while O4I4 is a business-focused ontology [62].
Due to the dynamic nature of the business environment, changes in workflow models of processes are becoming increasingly common. ”Process mining application provides a visual and fact-based proof for automation capabilities and enables prioritization of activities” [79]. Goal-oriented (’to-be’) dynamic workflows seek to provide flexibility concerning the execution of business processes. ”BPMN or similar languages must continue to be the primary instruments for defining a workflow” [65]. In general, optimization procedures still do not support full automation, hence automated optimization solutions are in demand, covering the monitoring, identification and modification of process structures [66]. For modeling processes in Cyber-Physical Systems (CPS), the BPMN4CPS model has been proposed [71]. Another extension of BPMN is based on the simulation of Cyber-Physical Production Systems (CPPS) in terms of reconfigurability, adaptability and reliability; this approach relies on the principles and standards of Model-Driven Architecture (MDA) to improve automation and customization [70]. Ref. [14] presented a largely automated business process modeling and simulation approach based on BPMN. This approach ”first annotates the BPMN model with what is necessary to make the model executable (i.e., performance parameters, execution resources, and expected workload), and then automatically maps the annotated model into simulation code ready to be executed” [14]. Furthermore, the effective implementation of production goals can be achieved by the combination of manufacturing process models; ”to accomplish this, a workflow control engine needs to be designed and standardized respectively” [69]. Ref. [68] addresses how BPMN and Coloured Petri Nets (CPN) [85] can be implemented in the Arrowhead Framework, an Industrial IoT (IIoT) framework that dynamically and flexibly supports automated manufacturing processes following I4.0 expectations. Furthermore, Robotic Process Automation (RPA) is an emerging technology to automate business processes that are driven by user interaction with software systems [86]. The integration of RPA into BPM allows their technologies to be connected and systematic methods to be combined [87].
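The annotate-then-simulate idea can be illustrated with a toy example: each activity of a purely sequential process is annotated with a resource and a mean duration, and the annotated model is "executed" to estimate the average cycle time. This is only a sketch, not the approach of Ref. [14]; real BPMN simulation engines additionally handle gateways, events and concurrent branches, and every name and parameter below is invented.

```python
import random

def simulate(process, arrivals, resource_counts, seed=1):
    """Toy simulation of an annotated sequential process model.

    process: list of (activity, resource, mean_duration) in execution order.
    arrivals: list of case arrival times.
    resource_counts: resource name -> number of parallel units.
    Returns the average cycle time over all cases. Cases are handled
    in arrival order, which only approximates a full discrete-event run.
    """
    rng = random.Random(seed)
    # Each resource unit is tracked by the time it next becomes free.
    free = {r: [0.0] * n for r, n in resource_counts.items()}
    cycle_times = []
    for arrival in arrivals:
        t = arrival
        for activity, resource, mean in process:
            units = free[resource]
            i = min(range(len(units)), key=units.__getitem__)  # earliest-free unit
            start = max(t, units[i])
            duration = rng.expovariate(1.0 / mean)  # stochastic service time
            units[i] = start + duration
            t = start + duration
        cycle_times.append(t - arrival)
    return sum(cycle_times) / len(cycle_times)

process = [("Setup", "operator", 5.0),
           ("Machining", "cnc", 20.0),
           ("Inspect", "operator", 3.0)]
avg = simulate(process,
               arrivals=[i * 10.0 for i in range(50)],
               resource_counts={"operator": 1, "cnc": 2})
print(f"average cycle time: {avg:.1f} min")
```

Rerunning the simulation with a different `resource_counts` (e.g. a second operator) is exactly the kind of what-if comparison that drives the choice between redesign options.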
The process implementation phase of the BPM life cycle has much potential to be supported by developing technologies and models. In addition to RPA, emerging phenomena include collaborative and autonomous robots, cloud manufacturing, IoT, digital twins, simulation and modeling, and the umbrella term of AI. Cloud manufacturing is a networked, intelligent manufacturing model that integrates IoT, the Semantic Web, cloud computing, high-performance computing and informatized manufacturing to derive a reliable, highly efficient, quality service for the whole manufacturing life cycle [72]. Cloud manufacturing should be integrated into the BPM life cycle to support business process evolution and adaptation, as well as to automate this procedure using the Business Process Cataloging and Classification System (BPCCS) [73]. In terms of improving the design and implementation of BPM, sensor-based IoT network technology can aid this process [74]. IIoT further empowers companies to adopt data-driven strategies and solutions. Being flexible and adaptive to changes in business processes, this paradigm transforms industries into cyber-physical production systems [75]. Furthermore, the process-based framework of self-adaptive workflows for CPS ensures agility and autonomy [76]. When implementing Enterprise Resource Planning (ERP) in an organization, the phases of the BPM life cycle must be continuously managed and monitored. According to an Indonesian case study, leading companies ”implementing ERP for more than five years, obtained high scores for BPM implementation. They perform well in terms of process identification, implementation, monitoring and control, but are weak in process discovery and redesign, mainly because they do not optimally use specific tools for process modeling and there is a lack of process governance” [77]. The implementation of Robotic Process Automation (RPA) provides a virtual workforce to automate manual, repetitive and error-prone tasks, the solutions of which will lead to the concept of socio-cyber-physical work systems [88]. The digital replica (digital twin) of a system is efficient for testing and continuously monitoring changes in business processes. The digital twin (DT) is an emerging technology used in intelligent manufacturing; it is the digital copy of a physical system that captures its attributes and behaviours in real time and predicts (simulates) system failures. The implementation of digital twins enables us to gain an understanding of how the physical system would operate after implementing the changes arising from business process redesign [78].
The process monitoring and controlling of the whole life cycle is essential for successful operation. This phase of the BPM model spans the whole life cycle, covering not only the monitoring of results but continuous data collection and model analysis. These activities can be highly supported by data-driven technologies, AI and ML functions, CPS and simulation, as well as process mining or digital twin technologies. Process mining can be used as a ”monitoring tool in production that allows data-based, fast decisions. These can help to reduce rework, improve the quality of our products, and reduce production costs” [55]. Process mining further allows the visual and fact-based proof of automation capabilities, prioritizes activities, and supports process automation. Successful process automation demands knowledge about automation potential, effective training of the robots and continuous monitoring of their performance [79]. Business process automation must be continuously managed and monitored [77]. Regarding manufacturing companies, trends have shifted towards individualized mass production, which presents challenges in achieving interoperability between physical and digital systems concerning the intelligent organization of resources. To manage this, a digital twin-driven Manufacturing Cyber-Physical System (MCPS) for the parallel control of a smart workshop under a mass individualization paradigm can be a solution [80]. The continuous monitoring and prediction of production systems are critical features for which intelligent manufacturing systems provide a real-time solution, resulting in lower cost, increased flexibility, higher productivity, and better quality [89]. The extension of the DT is the Hybrid Digital Twin (HT), which can ”recognize, forecast and communicate less optimal (but predictable) behaviour of the physical counterpart well before such behaviour occurs. An HT integrates data from various sources (e.g., sensors, databases, simulations) with the DT models and applies AI analytics techniques to achieve higher predictive capabilities while at the same time optimizing, monitoring and controlling the behaviour of the physical system. An HT is typically materialized as a set of interconnected models, achieving symbiosis among the DT models” [78]. Beyond the DT and HT, the concept of the cognitive digital twin (CT) encompasses some forms of cognitive capabilities, which enable improved data-, knowledge- and experience-based automatic decision-making. A CT overarches the whole life cycle of the system [78].
that the development process of a given ML application is of sufficient quality
to warrant the adoption into business processes". The CRISP-ML model is
shown in Figure 3 and consists of six phases: business and data understanding,
data preparation, modeling, evaluation, deployment, and monitoring and maintenance.
The sequence of these phases can vary, and the arrows indicate the
most frequent and important dependencies between the phases. Iteration and the
revisiting of previous steps are needed until success is achieved or the criteria are
met [94].
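The iterative nature of these phases can be sketched in a few lines of code. The following is an illustrative Python sketch (not from the paper): the phase names follow the text, while the criteria-checking function is a hypothetical placeholder for CRISP-ML(Q) quality gates.

```python
# Illustrative sketch of CRISP-ML iteration: walk the six phases in order;
# if a phase fails its success criteria, revisit from the start.
# Phase names follow the text; `criteria_met` is a hypothetical placeholder.

CRISP_ML_PHASES = [
    "business and data understanding",
    "data preparation",
    "modeling",
    "evaluation",
    "deployment",
    "monitoring and maintenance",
]

def run_crisp_ml(criteria_met, max_iterations=10):
    """Return the sequence of phases visited until all criteria are met."""
    history = []
    for _ in range(max_iterations):
        for phase in CRISP_ML_PHASES:
            history.append(phase)
            if not criteria_met(phase):
                break  # quality gate failed: revisit earlier phases
        else:
            return history  # every phase passed in one full sweep
    return history

# Example: the 'evaluation' phase fails once, forcing one full revisit.
state = {"evaluation_attempts": 0}
def criteria(phase):
    if phase == "evaluation":
        state["evaluation_attempts"] += 1
        return state["evaluation_attempts"] > 1
    return True

trace = run_crisp_ml(criteria)
```

This mirrors the point made above: the phase sequence is not strictly linear, and previous steps are revisited until the success criteria are met.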
I4.0 environment efficiently. Table 2 describes how the key features of CRISP-ML(Q)
can be fitted into the phases of the BPM life cycle, as well as defining the
outputs of each phase [95].
Table 2: Intertwining features between CRISP-ML(Q) and the phases of the BPM life cycle ensuring the suitability of integration

CRISP-ML(Q) phase | BPM life cycle phase(s) | Features | Outputs
Business understanding | Process identification; Process discovery | Creating an overall process architecture, defining the business scope and objective; defining the 'as-is' state to assure feasibility. | Determined business background and objectives; assessed situation; determined data mining/machine learning goals and success criteria; produced project plan and initial assessment of tools and techniques.
Data understanding | Process analysis | Collecting and verifying the data quality regarding the structured issues and, finally, deciding upon whether the project should be committed. | Collected initial data; described, explored and verified data.
Data preparation | Process analysis | Producing a data set for the modeling/design phase. | Created dataset and its description (selected, cleaned, constructed, integrated, formatted data).
Modeling | Process redesign | Creating models that satisfy the given constraints and requirements ('to-be' process model). | Selected modeling technique; generated test design; built and assessed model.
Evaluation | Process analysis; Process redesign | Design and analysis are intertwined, and as new change options are proposed, they are analyzed using process analysis techniques. The performance, robustness and explainability must be evaluated. | Evaluated results; reviewed process; determined further steps.
Deployment | Process implementation | Preparations to move from the 'as-is' process to the 'to-be' process are made and the transformation is carried out. Before rolling out a model to all, it is best practice to deploy it first to a small subset and evaluate its behaviour in a real-world environment (called canary deployment). | Planned deployment; produced final report and reviewed project.
Monitoring and maintenance | Process monitoring and control | Monitoring and maintenance processes to assure quality performance and identify corrective actions. The model has to adapt to changes in the environment. | Planned monitoring and maintenance plan.
The management of business processes by supporting digitization and automated
processes via AI has great potential for organizations. The consistent
redesign of processes to implement the I4.0 environment is critical, as is the
opportunity for companies to use their data and make intelligent, profitable and
real-time decisions [44]. ML-related standards, tools, methods and approaches
promote process automation and optimization, as well as facilitate BPM through
new algorithms and independent learning through continuous data analysis [44].
Automated optimization solutions are required, covering the monitoring, identification
and modification of the process structure [66]. A crucial element of the
redesign of business processes according to the I4.0 concept is standardized
knowledge representation [62]. In terms of I4.0 and the intelligent manufacturing
environment, ontological standardization relies highly on robotics and
automation to facilitate information exchange for communication and cooperation.
Industrial data mining and ML-related process standards (CRISP-DM/ML)
are comprehensive process models for data mining/machine learning that provide
a foundation for developing a specialized process model ('to-be') and define the
steps to achieve it [96, 94]. These steps are highly interrelated with the phases
of the BPM life cycle, as is highlighted in Figure 3. Integrating the constantly
developing technologies and diverse standards into the BPM model facilitates
I4.0 fitting as well as highly flexible, interconnected and agile business processes.
Furthermore, BPMN models describing both the 'as-is' and 'to-be' process models
ensure transparency. Simulation models of the processes support optimization
by revealing bottlenecks and validating structural efficiency. Moreover, the digital
twin of an organization, based on the digital copy of a physical asset, allows
real-time data of the current process to be assessed and collected, information
about the interaction with the environment to be derived, and information traces
to be captured.
2.2. Methodology - guideline for managing data-driven business processes
This section discusses the methodology for developing business process management.
This paper provides a guideline for realizing I4.0-driven business processes
by utilizing technological developments. Based on the previously presented
detailed analysis, this section outlines how, and at which stage of a project,
emerging technologies and data-driven decision support tools can be utilized to
ensure process efficiency and strengthen business agility.
We suggest following the BPM life cycle (Figure 2) through innovative I4.0
projects, which require business processes to be redesigned through iterative,
knowledge- and data-centred process analysis, following the design science research
paradigm. BPM is a comprehensive system to transform business operations
through modeling, execution and evaluation. Its life cycle model provides
a transparent, step-by-step reference model to transform processes successfully.
To ensure business agility, competitiveness and smart growth, organizations
must accomplish transparent, interconnected, real-time and virtual data-based
workflow management. These features can be derived by utilizing the potential
of, for example, process mining, IoT, BPMN models, ontologies and semantic
models, robotic process automation, digital twins and simulation models. These
technological alignments are categorized into the phases of the BPM life cycle
outlined in Table 1. Furthermore, we suggest integrating business analytical
standards, such as CRISP-DM/ML, into the BPM life cycle; Table 2 defines the
connecting features between the models. Businesses' vast amount of heterogeneous
data must be managed to continuously recognize, analyze, improve, and
implement changes within business processes. Since data mining and machine
learning projects closely related to AI tend to increase in industrial contexts,
supporting cross-industry standards for machine learning and data mining is
essential. This is especially true in manufacturing, one of the most promising areas
in which to successfully apply digital twins for maintenance, operation monitoring and
optimization purposes [97].
Table 3 presents a higher level of integrability of the concepts mentioned above,
showing how the phases of the BPM life cycle, as well as data-driven standards
and ontologies, are related at different levels of the organizational hierarchy
(business, process, implementation). The BPM life cycle and the CRISP-ML model
are broken down according to organizational hierarchy levels. This point of view
supports a comprehensive understanding of which stage of the project belongs
to which level of organizational operation.
Table 3: The intertwining of the BPM life cycle and standard models at the levels of organizational hierarchy

Hierarchical level | Definition | BPM life cycle phases | CRISP-DM/ML phases | Supporting standards
Business | Used by executives and senior business managers to organize their overall understanding, evaluation, and management of a business's performance. | Process identification, Process discovery; Process analysis; Process monitoring and controlling | Business understanding; Data understanding; Monitoring and maintenance | BPMN, UML AD, B2MML, OPC UA: IEC 62541, BPDM, OI40, CDD
Process | Used by managers, employees, business analysts, and human performance analysts to change how processes work. | Process analysis; Process redesign; Process monitoring and controlling | Data preparation, Evaluation; Modeling; Monitoring and maintenance | BPMN, ISO/PRF TR 24464, ALPS, ADACOR, BPQL, OPC UA: IEC 62541, BPML, BPDM, CDD, IEEE 1872-2015 standard
Implementation | Technology used to implement processes according to a context defined in the upper levels. It can also be used to develop job descriptions or hire new collaborators. | Process implementation; Process monitoring and controlling | Deployment; Monitoring and maintenance | AutomationML, FDI, EDDL, AASX, WSBPEL, EDI, ORArch
3. Lessons learned during the redesign of a CNC production line
This section presents a case study in which the BPM life cycle and I4.0-driven
tools are applied to redesign a production line of CNC machines. The case
study concerns a Hungarian assembly and engineering company oriented toward Industry
4.0 technology applications. The project aims to redesign the production line
of CNC machines into a fully automated 'smart' factory unit and replace the
current product tracking system with a flexible manufacturing execution system
(MES). This redesign of the business process should build significantly
on data-driven and I4.0-induced developments, tools and methods, as the design
of the planned automated factory unit and the integration of the manufacturing execution
system into the current phases require the detailed analysis and redesign
of the business processes.
In this section, the details of the guideline proposed in Section 2.2 are presented.
The purpose of this demonstration is twofold: 1) to present the development project
through the phases of the BPM life cycle and its decision support tools, and 2)
to create the BPMN-based simulation model of the 'to-be' model to validate its
structure as well as examine and analyze its processes. Simulation of the process
is used as a tool to validate and continuously monitor the efficient operation of
the tasks as well as to gather real-time information traces and data sets. Figure 4
shows the workflow of redesigning business processes while considering business
analytic (CRISP-DM/ML) aspects at a Hungarian assembly company. The
workflow follows the data-based BPM life cycle steps and indicates the tools and
methods applied at each step.
Figure 4: Workflow of redesigning business processes at a Hungarian assembly company

The current processes were identified through observations, in-depth interviews
with employees and organizational databases, which enabled a comprehensive
understanding of the business background and objectives to be gained. Processes were
discovered by analyzing log files, namely registries of events, processes, messages
and communications, which were utilized to recreate the proper structure and
attributes of the processes. Once it is clear how the company works, improvement
ideas inevitably emerge. BPMN modeling helps to answer questions
such as: How do they work? Why do they work based on this strategy? How can
it be improved? In order to graphically reconstruct the 'as-is' state, a BPMN
model of the business processes was built to ensure transparency and to support
the identification and discovery phases and the detection of bottlenecks. The
data-based analysis of the current process model was carried out by process
and data mining, analysis of KPIs and bottlenecks, and ANOVA to reveal
weaknesses and possible development areas. Examination of the extracted data
and log files enabled bottlenecks to be identified. The redesign of the
business processes was carried out using BPMN modeling for the desired
automated manufacturing process. A BPMN model of the 'to-be' state, covering the
improved processes and the automated factory unit, ensured transparency and the
ability to be simulated (deployed). The improvements based on the KPIs can
be evaluated similarly to Ref. [98]. As the application study is based on a continuous
development project at the Hungarian assembly company, there was
no opportunity to implement the 'to-be' model in a traditional (physical) way.
Therefore, a simulation model was developed, on the one hand, to assess in advance
how the changes would affect the outcome and other processes and to reveal
further potential bottlenecks. The simulation of the business process model
includes information about time periods, executors and resources to analyze
the processes and reveal possible bottlenecks. A simulation technique is used to
find the potential flaws in the current processes and to realize/optimize the design
of the process improvements/experiments [99]. The developed simulation
model enables the process structure and allocations to be validated, thereby
ensuring that the implementation phase is efficient.
The simulation environment is implemented in Python with the
package Casymda [100]. Casymda enables SimPy [101] discrete-event simulation
models to be created from BPMN models designed using the Camunda
Modeler [102]. The Camunda Modeler helps to ease the creation of simulation
models containing more complex processes. By using a defined subset of the available
BPMN elements combined with naming conventions, model graphs can
be created from which corresponding Python modules are derived from base templates. The
model graph is used to define the instantiation of the model objects and the predecessor/successor
relationships between these objects. The initialization of work-in-progress
(WIP) in the simulation model is realized using entry points of the
model elements. It accounts for seized resources and the processing time that
has already elapsed at the start of the simulation by comparing an entity's
entry timestamp to the starting time of the simulation. The entities then follow their
normal processing flow [103].
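The mechanism described above can be illustrated with a much-simplified, standard-library-only sketch; it does not use the Casymda or SimPy APIs. A BPMN-like model graph is given as a successor map, a heap-driven event loop advances the entities, and a WIP entity's first task is shortened by the processing time already elapsed at the simulation start (its entry timestamp versus the simulation start time). All names and durations are illustrative assumptions.

```python
import heapq

# Much-simplified sketch (NOT the Casymda/SimPy API): a BPMN-like graph as
# a successor map, a heapq-driven event loop, and WIP initialization that
# credits processing time already elapsed before the simulation start.

GRAPH = {"start": "cut", "cut": "mill", "mill": "end"}  # successor relations
DURATION = {"cut": 5.0, "mill": 20.0}                   # minutes per block

def simulate(entities, sim_start=0.0):
    """entities: list of (name, current_block, entry_timestamp)."""
    events, finished = [], {}
    for name, block, entered in entities:
        elapsed = max(0.0, sim_start - entered)          # WIP time credit
        remaining = max(0.0, DURATION.get(block, 0.0) - elapsed)
        heapq.heappush(events, (sim_start + remaining, name, block))
    while events:
        now, name, block = heapq.heappop(events)
        nxt = GRAPH.get(block)
        if nxt == "end" or nxt is None:
            finished[name] = now                         # entity leaves the flow
        else:
            heapq.heappush(events, (now + DURATION[nxt], name, nxt))
    return finished

# Entity 'e1' entered milling 8 min before the simulation starts, so only
# 12 of its 20 milling minutes remain; 'e2' starts cutting fresh.
done = simulate([("e1", "mill", -8.0), ("e2", "cut", 0.0)])
```

The point of the sketch is the WIP initialization: comparing the entry timestamp to the simulation start yields the remaining processing time, after which entities follow the normal flow defined by the graph.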
The constant monitoring and control function is implemented by detecting
information traces through process mining. In order to evaluate the results,
indicators were defined concerning, e.g., time, cost, performance and utilization.
Four types of performance indicators were used: the mean waiting time of the
entities before being processed by the resources, the value of stock waiting to
be processed, the capacity, and the utilization of the resources.
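As an illustration of how such indicators could be derived from a simulation event log, consider the following standard-library Python sketch; the log record fields and values are assumptions, not the company's data.

```python
# Illustrative, stdlib-only sketch (not the paper's code) of computing the
# four indicator types from a simulation event log. The record fields
# ("resource", "arrived", "started", "ended") are assumed field names.

log = [  # one record per entity processed at a resource (times in minutes)
    {"resource": "CNC-1", "arrived": 0.0, "started": 10.0, "ended": 40.0},
    {"resource": "CNC-1", "arrived": 5.0, "started": 40.0, "ended": 65.0},
    {"resource": "CNC-2", "arrived": 0.0, "started": 0.0,  "ended": 30.0},
]

def kpis(records, shift_length=480.0):
    waits = [r["started"] - r["arrived"] for r in records]
    mean_wait = sum(waits) / len(waits)          # mean waiting time
    produced = len(records)                      # capacity proxy: quantity done
    busy = {}
    for r in records:                            # busy time per resource
        busy[r["resource"]] = busy.get(r["resource"], 0.0) + r["ended"] - r["started"]
    utilization = {res: t / shift_length for res, t in busy.items()}
    # stock: maximum number of entities simultaneously waiting to be processed
    instants = {r["arrived"] for r in records}
    stock = max(sum(r["arrived"] <= t < r["started"] for r in records)
                for t in instants)
    return mean_wait, produced, utilization, stock

mean_wait, produced, utilization, stock = kpis(log)
```

Here the waiting-stock peak is checked only at arrival instants, which suffices because the number of waiting entities can only increase when a new entity arrives.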
It should be noted that, given the demands of continuous improvement, this
workflow should return to the first stage.
• Programming: Programmers write the CAM programs for CNC
milling. Their processes are described in Figure 5 from P 20 to P 23. Usually,
five programmers work during each shift.
• Cutting plant: In the cutting plant, the raw materials are cut to the desired
dimensions for CNC milling. These processes are described in Figure 5
from P 12 to P 14, where the walking processes are also indicated. On
average, two operators work during each shift.
• Milling plant: CNC milling operators handle the CNC machines. Their
processes are described in Figure 5 from P 24 to P 42, besides the walking
processes. Typically, six operators work during each shift and are
individually assigned to work on one of the eight CNC machines available.
Figure 5: Process overview
to be used for milling. Currently, 30% of the CAM programs are made by CNC
operators at the manufacturing plant. As soon as the CAM program is available
for production, the CNC operators can commence their work.
As the processes concerning the third stage of the workflow have been
thought through, many improvements can be recognized. BPMN modeling,
that is, understanding the 'as-is' state in collaboration with the company's
process engineers, helps greatly in identifying the possibilities for process improvement.
These activities help to understand the reasons behind the company's working
and operating strategy, and they help to identify the bottlenecks. On the other
hand, BPMN-based simulators can provide a systematic solution to identify the
bottlenecks of the system [99]. In our case, the following improvements are
investigated by evaluating different KPIs:
• First case: We believe the CNC operators should not spend their valuable
time writing CAM programs, since this decreases the utilization of
the CNC machines. Programmers should write all the CAM programs.
• Second case: While a CNC machine is being operated, the CNC operators
can handle other production tasks using other available CNC machines
instead of being idle.
• Third case: The third possibility is to decrease the non-value-added activities
of the operators, so AGVs (automated guided vehicles) can transport
raw materials and products instead of the operators. Furthermore, the
virtualization of the drawings could be considered, so sheets of paper do
not need to be handled.
Figure 6: Box-plots of the differences between the measured and estimated milling
times by the operators
with the box-plots shown in Figure 6. It is worth noting that the differences are
positive, i.e., the measured milling times are higher than the estimated times.
In the following, we investigated how the difference between the measured
and estimated milling times varies with the estimated milling times and with
the number of tools required for milling, as presented in Figure 7.
As can be seen, the residuals increase both with the estimated milling times
and with the number of required tools; hence, both variables are considered in the
prediction. Since the residuals are not independent of the operators, two linear
models were developed to predict the milling times. The coefficient of the
variable representing the number of tools is 0.01 with a p-value of 0.8 (> 0.05)
in the case of Operator 1, while it is 0.3 with a p-value of 0.02 (< 0.05) in the
case of Operator 2. This means that the number of required tools affected the
milling time only in the case of Operator 2.
An applicable result for the company is that the measured milling times are
not independent of the number of required tools. This stems from the fact that
the tool shelf does not have enough slots to store all the necessary tools, and
Figure 7: Analysis and prediction of milling times
grammers. The second case is simulated by queuing the appropriate resources
in the proper order. In the simulation, the entities first queue for a CNC machine
as a resource. Then, if available, an operator is queued to perform the
necessary processes. If CNC milling (P 36) is the process to be performed,
the operator is released to undertake other processes. Once the milling process has been
conducted, an operator is required again to continue the manufacturing process. The
third case is simulated by adjusting the walking time to zero minutes. In this
case, it is assumed that an AGV is always available to transport materials, and
that the transportation processes do not hinder other processes.
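One way such cases could be encoded is as scenario parameters that configure the simulation runs; the parameter names and baseline values below are illustrative assumptions, not the actual model configuration.

```python
# Illustrative scenario encoding of the three improvement cases; parameter
# names and baseline values are assumptions, not the real model's settings.

BASE = {
    "cam_by_operators_share": 0.30,             # 30% of CAM programs written
                                                # by CNC operators (as-is)
    "operator_released_during_milling": False,  # operator idle while milling
    "walking_time_min": 4.0,                    # transport done by operators
}

SCENARIOS = {
    "case1": {**BASE, "cam_by_operators_share": 0.0},  # programmers write all
    "case2": {**BASE, "operator_released_during_milling": True},
    "case3": {**BASE, "walking_time_min": 0.0},        # AGVs transport instead
}
```

Keeping each case as a small override of a shared baseline makes the simulated experiments comparable: exactly one assumption changes per scenario.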
Process mining techniques can be applied to validate and verify the developed
process model and simulator by analyzing a log file generated with the
simulator. Process mining is a powerful tool to verify that the obtained
results and analysis are consistent with the present 'as-is' and the desired
'to-be' systems. The log file generated from the simulation holds information
about the events, processes and communication messages, which enables the
simulated process to be reconstructed and its operations investigated.
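A minimal illustration of this idea is reconstructing the directly-follows relation from an event log; real projects would use a dedicated process mining library, and the log below is synthetic (the activity labels merely echo the P-numbered processes of Figure 5).

```python
from collections import defaultdict

# Minimal, stdlib-only illustration of the process-mining idea described
# above: reconstructing a directly-follows relation from a (synthetic)
# event log, so the simulated flow can be compared against the intended
# 'to-be' structure. Activity labels are illustrative.

event_log = [  # (case id, activity), ordered by timestamp within each case
    ("c1", "P12"), ("c1", "P13"), ("c1", "P14"),
    ("c2", "P12"), ("c2", "P14"),
    ("c3", "P12"), ("c3", "P13"), ("c3", "P14"),
]

def directly_follows(log):
    """Count how often activity a is directly followed by b within a case."""
    traces = defaultdict(list)
    for case, act in log:
        traces[case].append(act)
    dfg = defaultdict(int)
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dict(dfg)

dfg = directly_follows(event_log)
```

Comparing such a directly-follows graph mined from the simulator's log against the 'to-be' BPMN model reveals whether the simulated flow deviates from the intended process structure.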
Concerning the sixth stage, we have presented the investigated KPIs in each
case, and it can be seen that all the suggestions improve the performance of the
manufacturing unit.
By evaluating the KPIs step by step, we can gain valuable information about
the processes and their structures. In the first case, all CAM programs are written
by the CAM programmers, resulting in a slight increase in KPI2 and, therefore,
an increase in the quantity produced per shift. As a consequence, the average
waiting time of entities (KPI1) to be processed by the CAM programmers
increased, as did the resource utilization (KPI3) and the stock quantity (KPI4).
The average waiting time is about 45 minutes, the average stock is eight pieces,
and the degree of utilization per shift is 50%. In the second case, the milling
operators can handle more CNC machines, which suggests a significant
increase in capacity; the quantity produced per shift rose from 25 to 35, to
be exact. As shown by KPI1, the operators are not subjected to significant
waiting times, unlike the CNC machines, which means that the number of
Figure 8: KPIs of the different cases (KPI1: average waiting time, KPI2: produced quantity,
KPI3: resource utilization)
CNC machines is the relevant constraint. In the third case, the materials are
transported by an AGV robot and all information is visualized. This increases the
production capacity slightly and reduces wasted time, so the cutting operators
can finish their tasks earlier, as seen from KPI4. However, this leads to
an incremental change in KPI1 and KPI4 at the CAM programmers. Following
the suggested process improvements, we can increase the production capacity,
as our simulation has demonstrated.
1. Process identification: The combination of the observation method, in-depth
interviews and the analysis of databases (the product tracking system
and CAM files) allowed a common understanding of the current processes,
as well as of the structure and operation of the automated factory unit, to be
gained.
2. Process discovery: The BPMN graphical notation of the 'as-is' model
brings about a transparent visualization and a logical structure to the concept.
3. Process analysis: The extracted data is analysed by data and process
mining tools. The impact of the operator on the CNC milling times was
analysed by ANOVA. The model was optimized according to the defined
KPIs (e.g., time, cost, utilization).
4. Process redesign: The BPMN model of the 'to-be' state is built to
ensure transparency. The process model supported the specification of
the MES and defined which characteristics we aimed to measure.
5. Process implementation: The BPMN-based simulation model of the
'to-be' stage is built. The simulation model is based on BPMN, and parameters
concerning time, costs and resources can be added. The simulation
reveals logical and structural efficiency and provides information about
the utilization and operators of the business processes. Furthermore, it allows
possible modifications to be tested before putting them into practice. The
simulation model revealed information about the steps and supported the
identification of which variables should be measured.
6. Process monitoring and control: The results are evaluated by comparing
KPIs and revising the model. Further and continuous innovations
and the revision of the processes are essential.
This project phase covers the six stages of the BPM life cycle: identification,
discovery, analysis, redesign, implementation (simulation) and the constant
monitoring of processes. As a result, an outline of the desired business process is
determined. The simulation model enabled the processes to be brought to life by
adding time, cost and resource parameters to the model. Later in this project,
these attributes will be added to the concept, along with the analysis of log files
used to validate its efficiency. Thereby, the evaluation of the current processes is
possible, and estimations can be made in advance. The model must be capable
of handling and cooperating with information systems; moreover, data exchange
based on standards must be considered. The integration and implementation of
the concept is a complex task that requires further examination and constant
evaluation.
3.3. Discussion
The proposed methodology guides the management of business processes in the age
of Industry 4.0. The application of the methodology requires the capability
to adapt to I4.0 technologies and organizational changes. To implement the
I4.0 concept, companies must address significant challenges as well as develop
internal and external capabilities and processes [104]. Organizations must be
aware of their I4.0 maturity level in order to facilitate the understanding and
evaluation of the actual business state and process architectures ('as-is' model),
as well as to accurately determine development strategies [48].
Maturity models reveal key focus areas such as 'Strategy', 'Operation', 'Products',
'People', 'Technology' and 'Data and information'. An organizational strategy
is what propels this transformation process towards the concept of I4.0. Operation
refers to vertical and horizontal value-chain integration, which cannot be
managed without an advanced technological background to share and exchange
information. The result is a network of cross-enterprise planning and control of
the entire product life cycle. Moreover, the integration of the production system
offers the potential to enhance productivity, quality and flexibility [105].
However, people cannot be neglected, as they are the key drivers of knowledge.
Continuous training of the workforce is needed to meet new skill-set requirements.
Technological adaptations and changes across the organization demand
complex and cross-functional skills and abilities such as logical and mathematical
reasoning, flexibility and creativity, problem sensitivity, active learning, critical
thinking, coordination and negotiation skills, digital and analytical skills,
technical and management skills, systems thinking and complex problem-solving
skills [106]. These skill-sets enable the workforce to achieve high flexibility and
faster adaptation to the continuously changing business environment, to critically
analyze data- and knowledge-driven processes, and to integrate new technological
developments, tools and methods.
The continuous development and optimization procedures require a wide
range of knowledge and methods to be applied to gain a comprehensive insight
into the operation of business processes. The data-driven redesign of business
processes is resource-intensive due to highly complex data and requires an
integrated data management system to ensure overall data availability within
the organization. Although process mining is an efficient tool for discovering
business processes to derive BPMN models, some requirements need to be addressed,
such as data reliability, correctness and completeness; moreover, event
logs should be recorded in a way that ensures privacy and security [53].
We believe that organizations that strive to succeed in the I4.0 environment
can utilize this concept to gain a shared understanding of the interrelation between
BPM, business data analytics and data-driven technologies. Furthermore,
researchers in the field may deem the development of the business process management
life cycle via I4.0-driven and data-based decision support tools to be
suitable.
4. Conclusion
tasks that need to be done and those that are derived within a given time
period. Information stored in the MES enabled data to be identified, discovered
and analyzed in real time. Furthermore, to continuously monitor and develop
the operation of the autonomous factory unit, optimization algorithms must be
applied.
This paper highlighted that the emerging technologies connected to the BPM
life cycle and the CRISP-ML concept serve the same goal: integrating I4.0 developments
and business analytics into the business process management phases.
Innovative projects such as creating an automated factory unit cannot be realized
without AI and ML technologies and decision support tools, including
modeling, augmented data management and analytical techniques. In the future,
AI and ML applications will gain greater importance in addition to analytical
techniques and decision intelligence, including modeling. These trends
and techniques play a significant role in predicting and optimizing business processes
and in efficiently realizing innovative projects. Consequently, organizational
business processes in the age of I4.0 must be supported with novel tools to adapt
to industry trends and strengthen business agility.
5. Acknowledgment
This work was supported by the TKP2020-NKA-10 project financed under
the 2020-4.1.1-TKP2020 Thematic Excellence Programme by the National Research,
Development and Innovation Fund of Hungary. The research was also
supported by the 2019-1.1.1-PIACI-KFI-2019-00312 project (Mobilized collaborative
robot-based development of a modular Industry 4.0 production system
with quality management functions).
References
[8] L. Gerlitz, et al., Design management as a domain of smart and sustainable
enterprise: business modelling for innovation and smart growth
in Industry 4.0, Entrepreneurship and Sustainability Issues 3 (3) (2016)
244-268.
[9] G. Tsakalidis, K. Vergidis, G. Kougka, A. Gounaris, Eligibility of BPMN
models for business process redesign, Information 10 (7) (2019) 1-14.
[17] A. Schumacher, S. Erol, W. Sihn, et al., A maturity model for assessing
Industry 4.0 readiness and maturity of manufacturing enterprises, Procedia
CIRP 52 (1) (2016) 161-166.
[19] I. Grangel-González, P. Baptista, L. Halilaj, S. Lohmann, M.-E. Vidal,
C. Mader, S. Auer, The Industry 4.0 standards landscape from a semantic
integration perspective, in: 2017 22nd IEEE International Conference on
Emerging Technologies and Factory Automation (ETFA), IEEE, 2017, pp.
1-8.
Advances in Production Management Systems, Springer, 2015, pp. 467-475.
[30] M. Zur Muehlen, M. Indulska, Modeling languages for business processes
and business rules: A representational analysis, Information Systems
35 (4) (2010) 379-390.
[35] M. Dumas, M. La Rosa, J. Mendling, H. A. Reijers, Introduction to business
process management, in: Fundamentals of Business Process Management,
Springer, 2018, pp. 1-33.
[38] T. Ahmad, A. Van Looy, Business process management and digital innovations:
A systematic literature review, Sustainability 12 (17) (2020).
[39] M. Rosemann, Proposals for future BPM research directions, in: Asia-Pacific
Conference on Business Process Management, Springer, 2014, pp. 1-15.
[40] F. Martinez, Process excellence the key for digitalisation, Business Process
Management Journal 25 (7) (2019) 1716-1733. doi:10.1108/BPMJ-08-2018-0237.
[42] W. Bandara, A. Van Looy, J. Merideth, L. Meyers, Holistic Guidelines for
Selecting and Adapting BPM Maturity Models (BPM MMs), in: International
Conference on Business Process Management, Springer, 2020, pp. 263-278.
[44] D. Paschek, C. T. Luminosu, A. Draghici, Automated business process
management–in times of digital transformation using machine learning or
artificial intelligence, in: MATEC Web of Conferences, Vol. 121, EDP
1065 Sciences, 2017, p. 04007.
[46] W. Van Der Aalst, Data science in action, in: Process mining, Springer,
1070 2016, pp. 3–23.
[47] J. Griffin, Developing strategic KPIs for your BPM system, Information
Management 14 (10) (2004).
[51] M. Kerremans, Market Guide for Technologies Supporting a DTO, Gartner Inc (2018).
[52] L. Reinkemeyer, Process Mining, RPA, BPM, and DTO, in: Process Mining in Action, Springer, 2020, pp. 41–44.
[54] C. Flechsig, J. Lohmer, R. Lasch, Realizing the Full Potential of Robotic Process Automation Through a Combination with BPM, in: Logistics Management, Springer, 2019, pp. 104–119.
[59] L. Dezi, G. Santoro, H. Gabteni, A. C. Pellicelli, The role of big data in shaping ambidextrous business process management, Business Process Management Journal 24 (5) (2018) 1163–1175. doi:https://doi.org/10.1108/BPMJ-07-2017-0215.
[63] J. I. Olszewska, M. Houghtaling, P. Goncalves, T. Haidegger, N. Fabiano, J. L. Carbonera, S. R. Fiorini, E. Prestes, Robotic ontological standard development life cycle, in: IEEE ICRA 2018 WELCARO workshop, 2018.
[64] IEEE Standards Association, IEEE 1872-2015 - IEEE Standard Ontologies for Robotics and Automation.
URL https://standards.ieee.org/standard/1872-2015.html
national Conference on Enabling Technologies: Infrastructure for Collaborative Enterprises (WETICE), IEEE, 2016, pp. 152–157.
[75] D. Mourtzis, E. Vlachou, N. Milas, Industrial Big Data as a result of IoT adoption in manufacturing, Procedia CIRP 55 (2016) 290–295.
[79] J. Geyer-Klingeberg, J. Nakladal, F. Baldauf, F. Veit, Process Mining and Robotic Process Automation: A Perfect Match., in: BPM (Dissertation/Demos/Industry), 2018, pp. 124–131.
[81] Object Management Group, About the Business Process Model And Notation Specification Version 2.0.
URL https://www.omg.org/spec/BPMN/2.0/
[83] A. McAfee, E. Brynjolfsson, Race against the machine: How the digital revolution is accelerating innovation, driving productivity, and irreversibly transforming employment and the economy (2011).
[88] E. Hozdić, P. Butala, Concept of socio-cyber-physical work systems for Industry 4.0, Tehnički Vjesnik 27 (2) (2020) 399–410.
[90] S. F. Wamba, D. Mishra, Big data integration with business processes: a literature review, Business Process Management Journal 23 (3) (2017) 477–492. doi:https://doi.org/10.1108/BPMJ-02-2017-0047.
[92] J. Han, J. Pei, M. Kamber, Data mining: concepts and techniques, Elsevier, 2011.
[95] P. Chapman, J. Clinton, R. Kerber, T. Khabaza, T. Reinartz, C. Shearer, R. Wirth, CRISP-DM 1.0: Step-by-step data mining guide (1999).
[97] C. Cimino, E. Negri, L. Fumagalli, Review of digital twin applications in manufacturing, Computers in Industry 113 (2019) 103130.
[102] A. Fernandez, Camunda BPM platform loan assessment process lab, Brisbane, Australia: Queensland University of Technology (2013).
[107] Gartner, Gartner top 10 trends in data and analytics for 2020.
URL https://www.gartner.com/smarterwithgartner/gartner-top-10-trends-in-data-and-analytics-for-2020/