Open Source Framework for the Design, Development and Integration of Effective Digital Twins
Abstract
arXiv:2301.05560v1 [cs.SE] 12 Jan 2023
Although digital twins have recently emerged as a clear alternative for reliable asset representations,
most of the solutions and tools available for the development of digital twins are tailored to specific
environments. Furthermore, achieving reliable digital twins often requires the orchestration of tech-
nologies and paradigms such as machine learning, the Internet of Things, and 3D visualization, which
are rarely seamlessly aligned. In this paper, we present a generic framework for the development of
effective digital twins combining some of the aforementioned areas. In this open framework, digital
twins can be easily developed and orchestrated with 3D connected visualizations, IoT data streams,
and real-time machine-learning predictions. To demonstrate the feasibility of the framework, a use
case in the Petrochemical Industry 4.0 has been developed.
[10, 11]. Therefore, application developers may struggle to find digital twin frameworks in Industry 4.0, especially open-source tools for the development of effective 3D-IoT-AI-powered digital twins. Some of the most relevant works in this field are summarized below.

Kamath et al. [12] aim to bridge the gap between academia and industry by providing an architecture for the development of digital twins based on open-source components. Our architecture shares some similarities with their study, as some of the open-source technologies used, such as Kafka, InfluxDB, and Eclipse Ditto, are also adopted in this work. However, our architecture goes further, as it allows the integration of digital twins not only with real-time data monitoring but also with machine learning techniques (Kafka-ML) and 3D rendering. Finally, in our architecture, in addition to using open components, the whole digital twin platform and related components are available on GitHub2.
2 https://github.com/ertis-research/digital-twins-platform

Karan et al. [13] also consider the inclusion of open tools such as Eclipse Ditto and OpenPLC for the realization of a digital twin framework. They orchestrate a simulation model for modelling the Computational Fluid Dynamics (CFD) of a portable temperature-controlled chamber. Even though the integration of simulation models is promising for combining real-world monitoring data and physical models, this framework lacks flexibility and is only adapted to CFD systems.

A modular digital twin framework is presented in [14]. This framework enables the monitoring, assessment, and 3D visualization of assets and has been validated in a real manufacturing scenario. Some KPIs can be defined for measurement and alert generation through LabVIEW. The main limitation of this architecture is that all the elements are connected to a single OpenPLC server. As evaluated in the article, scalability can be a challenge in this architecture, and a possible solution is the replication of the framework structure for each process/machine to be monitored and modelled. Instead, our architecture is entirely based on fault-tolerant containers that are managed in real time through Kubernetes.

A digital twin development guide has been built around the FIWARE ecosystem in [7]. FIWARE is a well-known ecosystem promoted by the European Commission for the development of next-generation applications in multiple sectors. FIWARE has open-source components in place for data ingestion, data analysis, authentication, and data storage, among others, for service realization. The solution describes a guide on how digital twins can be defined in FIWARE, but a framework adapted to digital twins has not been developed and validated as in our work.

Khan et al. [15] propose a spiral digital twin framework, which aims at the secure and reliable management of digital twin data through blockchain. This framework aims to provide continuous and consistent synchronization between the digital twin and the real assets. A new blockchain solution, twinchain, is proposed to address the problems of merging digital twins with blockchain technology. The spiral digital twin framework focuses on improving the reliability of digital twin information and does not provide a general-purpose framework for the development of effective digital twins as our work does. Although communications within our components are encrypted, we will also look into adopting blockchain to improve the reliability of the system in the future.

3. Open-source architecture for the design and development of 3D-IoT-AI-powered digital twins

The fundamental pillar of the architecture is the well-known framework for digital twins Eclipse Ditto. Eclipse Ditto offers an entity to model twins, storage of the state of the twins and their events, an access control system, and support for different types of connections that allow interaction with the twins and with other backends. Incoming messages to Eclipse Ditto, except those from the HTTP API, must follow the Ditto Protocol3 format, or a payload mapping must be configured in the connection.
3 https://www.eclipse.org/ditto/protocol-overview.html

All the necessary elements in our architecture have been connected around Eclipse Ditto, both to
make up for its shortcomings in terms of what a digital twin essentially requires and to extend its functionality. All these elements are open-source tools and, although most of them are external projects, some of the services have been specifically developed to add certain functionality or to support the connection of incompatible elements.

The result, of which we can see an overview in Figure 1, is a microservices architecture that covers the basic needs of a digital twin platform, such as device connection, real-time data storage, and data visualization, but that really stands out for other functionalities, such as the prediction of real-time data streams through machine learning and the possibility of displaying the state of the twin by means of interactive 3D representations.

The microservices architecture offers great modularity, reusability, and scalability. Each of its components is independent and responsible for a specific function. This makes it possible to replace components easily without affecting the rest of the system, to reuse some of these services in other projects, and to extend the system with less difficulty. Regarding communication between services, most of them have an API (Application Programming Interface) to facilitate interaction, although when it comes to receiving and sending real-time data, pub-sub protocols such as AMQP and Apache Kafka are used instead.

To ease the management of all the services in the architecture, services have been packaged in containers, and a container orchestrator coordinates them. In this case, Docker has been used as the container technology and Kubernetes as the container orchestrator. For each of the external projects, its Docker version has been chosen, and all the developed services have been containerized. Moreover, encapsulating each service in an isolated environment ensures its portability and correct execution in most systems.

On GitHub4, the description of the system, the installation and connection manual of the architecture, the documentation necessary for its use, and links to all the services and plugins developed (which, in addition to the code produced, also have their own installation manuals and usage documentation) can be found. The discussion of the architecture has been organized according to the main milestones, separating it into its basic functionality as a digital twin platform, data prediction with machine learning, and the 3D representation of the state of the twin.
4 https://github.com/ertis-research/digital-twins-platform

3.1. Essential functionality

The objective of this milestone was to obtain a platform in which the digital twin of any element made up of sensors can be defined. For this, it is essential to define the twin, to obtain and connect the IoT information, to store the twin data as real-time series, and to be able to consult these data in a user-friendly way. Although more secondary, it is also convenient to include the creation and management of twin types. This streamlines the tedious task of creating multiple twins that, while corresponding to different physical devices, have exactly the same features. For example, if we have ten identical sensors in the real world, we can define just one type and create the ten twins from it.

The blue components in the architecture (Figure 1) represent the part corresponding to this essential functionality. This part is mainly composed of open-source projects but also includes two elements that have been developed to complete the desired functionality.

The main element is Eclipse Ditto, as mentioned, an open-source framework for building digital twins. Eclipse Ditto does not provide any system to obtain the information sent by the devices, so Eclipse Hono5 is used for this purpose.
5 https://www.eclipse.org/hono/

Eclipse Hono is a platform that provides several interfaces for connecting a large number of IoT devices, unifying them into a single AMQP 1.0 endpoint, where the information received can be read and commands can be sent to trigger actions on any of the devices. It can receive information via common IoT protocols, such as MQTT, AMQP, HTTP, and CoAP, as well as custom adapters. This is the recommended tool to work with Eclipse Ditto and, thanks to the
Figure 1: An overview of the 3D-IoT-AI-powered digital twin architecture. In blue, the essential functions; in red, the 3D representation; and in yellow, the data prediction with ML.
Eclipse cloud2edge package6, the integration of the two is very convenient.
6 https://www.eclipse.org/packages/packages/cloud2edge/

On the one hand, the Eclipse Ditto Thing entity can be used in different ways, so after studying different options, such as considering a Ditto Thing a complete twin that includes each sensor as one of its features, a design decision was made to assign a Ditto Thing to a single entity or sensor and to create parent-child hierarchies between these entities. These hierarchies have been implemented in a service called Ditto-Extended-API, which can be considered a layer above Eclipse Ditto and which provides an API that replaces the one offered by that technology. In addition to verifying that the specified constraints are satisfied, this service adds all the functionality related to twin types, allowing both their management and the creation of twins from them.

On the other hand, another feature that Eclipse Ditto lacks is the storage of the twin state at different time instants. To solve this, InfluxDB7 has been chosen as a time-series database. This is a well-known database with a large community and is ideal for processing sensor data.
7 https://www.influxdata.com/products/influxdb-overview/

To collect the data, we have considered Telegraf8, a plugin-driven server agent that supports a large number of data sources and with which we can configure the data ingestion into the database. As Telegraf cannot collect the data directly from Eclipse Ditto, because neither technology implements a broker for the protocols the other supports, we need an intermediate element that connects them. Apache Kafka9, one of the best-known streaming and processing platforms for real-time data, is a very good alternative, since Telegraf has a Kafka consumer10 input plugin available and Eclipse Ditto has the option of publishing events to a Kafka topic.
8 https://www.influxdata.com/time-series-platform/telegraf/
9 https://kafka.apache.org/
10 https://www.influxdata.com/integration/kafka-telegraf-integration/

Finally, Grafana11 has been chosen to act as the front-end, i.e., the user interface for end-users.
11 https://grafana.com/
This technology provides support for metrics visualization from the most popular databases, including InfluxDB. It allows making queries in the language defined by the chosen data source and presenting the result in different types of interactive panels. These panels
are part of dashboards, which can be modified to the user's liking. Grafana also includes a role-based access control system. Another of its strong points, and one of the most important for this project, is that it allows the creation of personalized panels and the inclusion of any type of functionality by means of plugins, providing the libraries and documentation necessary for this.

An app plugin12, called Digital Twins, has been developed for Grafana. This plugin adds a graphical interface for managing twins and types in the same application that is already used in the front-end for querying the status of the twins, thus unifying all the functionalities of the platform. Its functionality basically consists in making calls to the Ditto-Extended-API service in a user-friendly way.
12 https://github.com/ertis-research/digital-twins-plugin-for-grafana/

3.2. Data prediction with Machine Learning

This milestone aims to achieve the integration of the platform with machine learning algorithms. This can be useful for digital twins to predict their next state or a failure situation, for instance, the values that a sensor should return in case no real data are received from it, either because it has been switched off or because it has suffered some kind of failure.

The part of the architecture in charge of achieving this objective corresponds to the yellow components in Figure 1. Here, the main component is Kafka-ML [16], which is in charge of machine learning life cycle management and complements this architecture. In addition, in order to integrate it with Eclipse Hono and Eclipse Ditto and fulfil the required functionality, three specific services have been developed: Eclipse-Hono-to-Kafka-ML, Error-Detection-for-Hono-with-Kafka-ML, and Kafka-ML-to-Eclipse-Ditto.

Kafka-ML is an open-source framework13 developed by our group that manages the life cycle of ML/AI applications in production environments through continuous data streams. Unlike traditional frameworks that work on datasets or static files, Kafka-ML allows both training and inference with continuous data streams, enabling users to have fine-grained control of the data ingested into popular ML frameworks such as TensorFlow and PyTorch, both of which it currently supports. Through its user-friendly Web UI, Kafka-ML allows the management and deployment of ML models, from their definition to final deployment for inference.
13 https://github.com/ertis-research/kafka-ml/

One of the main purposes of machine learning applied to digital twins is the prediction of the future states of the twin, which may be of considerable utility when control or improvement of the element that the twin represents is sought. It also aims to predict certain features or values of the twin that cannot be measured or obtained directly or accurately in any other way. As mentioned above, these models are deployed in Kafka-ML, so for their connection with Eclipse Hono a service has been developed, named Eclipse-Hono-to-Kafka-ML14. This service constantly reads from the Eclipse Hono endpoints corresponding to the tenants (the entity that allows the logical partitioning of devices into unique groups) that contain devices whose data must be sent to the deployed Kafka-ML models. When a message arrives from one of these devices, the service processes the data to comply with the required format and automatically sends it to the respective Kafka-ML input topic, where the model will process the data to obtain a prediction as a result. Then, the digital twin is updated with the prediction. Remarkably, when predicting future states of the twin, the main digital twin is not updated; instead, a copy of it containing the predicted state is made in Eclipse Ditto. This copy is considered the same twin but advanced a certain time.
14 https://github.com/ertis-research/eclipse-hono-to-kafka-ml

Another of the strong points of this part is detecting when a sensor is not sending its data and acting in consequence. To this end, a new service, Error-Detection-for-Hono-with-Kafka-ML15, has been created.
15 https://github.com/ertis-research/error-detection-for-eclipse-hono-with-kafka-ml/
This service basically reads the information received by the specified Hono tenants and checks that the devices that belong to them
are sending their data in accordance with their periodicity. In case one of them does not send its data when due, the service sends the last values received from that sensor, plus other necessary data such as the date, to a Kafka-ML model trained with historical data. Kafka-ML will predict the next state of the system to avoid a service interruption due to the sensor failure until the sensor is available again.

In the other direction, the data generated by Kafka-ML have to be consumed by Eclipse Ditto. Initially, the idea was to take advantage of the payload mapping functionality provided by Ditto, creating a source connection to each of the Kafka-ML output topics where the models send their predictions and mapping the information received so that it could be supported by Eclipse Ditto. This was not viable since, at the time of the development of this platform, Eclipse Ditto had not yet implemented connections with Apache Kafka acting as a data source. That is why it was decided to build an intermediate service16 that reads the information from Kafka, maps it to the Eclipse Ditto Protocol, and publishes it to a message broker that Eclipse Ditto can connect to, in this case RabbitMQ17, which uses AMQP 0.9.1.
16 https://github.com/ertis-research/kafka-ml-to-eclipse-ditto/
17 https://www.rabbitmq.com/

3.3. 3D representation of the state of the twin

An important aspect of digital twin platforms is the representation of the data. In the most relevant industrial platforms, it is common to find 3D representations of the twin that make it much easier to understand its information and the component being consulted at any given moment. Achieving this 3D representation of the twin is the main objective of this milestone.

The architecture designed for 3D visualization corresponds to the red components in Figure 1 and basically consists of the creation of a panel plugin for Grafana that allows the display of a 3D model developed with Unity18 and with which, moreover, it is possible to interact in both directions.
18 https://unity.com/

Unity is a software platform that mainly focuses on video game development, although it can also be used in other contexts. It is one of the most popular graphics engines, with one of the largest communities. Although it is not an open-source tool, it can be used free of charge for personal use or for low-budget projects. This technology allows assigning behaviour to 3D objects through scripts and interacting both with the user and with other elements in the environment. A wide variety of formats are available for importing 3D models, including Blender files. Blender is an open-source 3D creation suite that also has a large community and is of help when creating or modifying 3D objects. Unity also allows the project to be built in several formats, including a specific one for web rendering called Unity WebGL. Compilation in this format is of essential importance in this part of the architecture.

Having the 3D representation of the twin built in this format, the next step is to run it in Grafana, the tool we have chosen as the front-end of the platform. To do this, we developed a panel plugin. This type of plugin allows the creation of a custom panel that can be included in any dashboard. It usually represents or uses data series, which depend on the query and the data source introduced by the user in the panel configuration. Additionally, it can be implemented in such a way that it provides the user with a series of custom options in its configuration, by means of which its behaviour or visualization can be easily modified.

As well as rendering a Unity WebGL build, the panel plugin must allow interaction with Grafana and with the user in both directions. In other words, the user's interaction with Grafana will be reflected in the 3D model and, in turn, the user's interaction with the representation may have repercussions on the information displayed by other panels of the dashboard. This plugin is called the Unity panel plugin19.
19 https://github.com/ertis-research/unity-plugin-for-grafana/
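The intermediate service that bridges Kafka-ML output topics and Eclipse Ditto via RabbitMQ can be sketched in Python as follows. This is a minimal illustration, not the actual kafka-ml-to-eclipse-ditto code: the namespace, thing name, feature path, queue name, and header set are invented, and the wiring assumes the kafka-python and pika client libraries.

```python
import json

def to_ditto_protocol(namespace, name, feature, prediction):
    """Wrap a Kafka-ML prediction in a Ditto Protocol 'modify' envelope.

    The feature path chosen here is an assumption; the real service maps
    whatever payload the deployed model emits."""
    return {
        "topic": f"{namespace}/{name}/things/twin/commands/modify",
        "headers": {"content-type": "application/json"},
        "path": f"/features/{feature}/properties",
        "value": prediction,
    }

def run(kafka_brokers, output_topic, amqp_url, ditto_queue):
    # Hypothetical wiring: consume the Kafka-ML output topic and publish
    # to RabbitMQ (AMQP 0.9.1), from which Eclipse Ditto reads through a
    # source connection.
    from kafka import KafkaConsumer   # kafka-python
    import pika                       # RabbitMQ client

    consumer = KafkaConsumer(output_topic, bootstrap_servers=kafka_brokers,
                             value_deserializer=lambda b: json.loads(b))
    channel = pika.BlockingConnection(pika.URLParameters(amqp_url)).channel()
    channel.queue_declare(queue=ditto_queue, durable=True)
    for record in consumer:
        msg = to_ditto_protocol("org.example", "robot1-sensor2",
                                "prediction", record.value)
        channel.basic_publish(exchange="", routing_key=ditto_queue,
                              body=json.dumps(msg))
```

Publishing plain JSON in the Ditto Protocol envelope is what lets Eclipse Ditto consume the message without an additional payload mapping in the connection.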
4. Development and implementation
20 https://helm.sh/
21 https://github.com/yahoo/CMAK
kers. When defining it, it is necessary to specify which data topics to subscribe to. In our case, it subscribes to the twin's events, since Ditto launches one after each update and includes the new state of the twin in the message. It was also necessary to specify in the authorization context a valid user with at least read permissions on the twin. We can consult the existing users and their permissions in the associated policy. Once the connection is created in Eclipse Ditto, every time the twin is updated, its new status is published to the Kafka topic we created.

To install InfluxDB, the Helm version has also been chosen. NodePort has been set as the service type, and a specific password has been assigned to the administrator, thus avoiding future problems if the package needs to be restarted. During installation, it was also necessary to create a persistent volume related to a custom storage class. Although InfluxDB provides a very intuitive interface, which is the one that will mainly be used, it is worth also installing and configuring the Influx CLI for certain exclusive functionalities such as user management.

Once InfluxDB was configured, we had to collect the data exposed in the Kafka topic created earlier. For this purpose, Telegraf is used, a tool that was also installed using Helm. To indicate the selected plugins and their configuration to Telegraf, a custom values file in YAML format is required during the installation. In this file, in addition to the parameters corresponding to the deployment in Kubernetes, there is a configuration section. At this point, it is essential to review the tool's documentation to see what parameters are required by each of the plugins, whether input or output. In our case, as output we have InfluxDB in its second version, which in one of its fields requires a token with write permissions previously created in InfluxDB. As input, we have a Kafka consumer, for which we specify the address of the broker and the topic to which it should subscribe. After applying this configuration, our InfluxDB instance should already be receiving the metrics from Kafka.

Finally, there is the connection between InfluxDB and Grafana, which allows queries to be made to an InfluxDB database from Grafana panels, which display the result obtained. First, the Grafana chart has been installed, activating persistence. Once the tool is ready, in its interface, InfluxDB has been configured as the data source, indicating that it is the version that uses Flux as the query language and filling in the rest of the requested data. Once the tool approves the connection, we can represent the twin information in Grafana, thus finishing the connection of all the existing technologies and moving on to the development of the extra functionalities.

4.2. Development of Ditto-Extended-API

Eclipse Ditto offers us the Ditto Thing entity, which always belongs to a namespace and is basically composed of an identifier and a series of attributes and features. The attributes correspond to the static part of the entity, whereas the features correspond to the dynamic part. In this way, the technology gives us full freedom to define how we want our models to be and how we want to use the tool. If a Ditto Thing corresponds to a single sensor, we can store manufacturing information, for example, as attributes, whereas the data it receives will be stored as features. Another option is for the Ditto Thing entity to encapsulate a set of sensors, separating the data of each one within its features.

In order to understand the design decision taken, it is interesting to first clarify certain concepts about digital twins. A digital twin can be considered both an entity that receives information from a single device and one that is composed of other entities, which can also be understood as twins. A twin could then be represented as a tree, where each leaf is the representation of a single sensor. Thus, a factory that has three robots, each with its own sensors, could be contemplated as shown in Figure 3.

As for the twin types, in the example of the factory, if robots 1 and 2 are the same model, we could create a type to facilitate their creation in case more robots of the same type are added to the factory. In addition, the sensors that compose them should also form their own types since, for example, sensor 2 (and therefore sensor 4) can be of the same model as sensor 5, even if they belong to different robot types.
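Under this one-Thing-per-sensor design, the factory example could be encoded as one Ditto Thing per node of the tree. The sketch below is illustrative only: the identifiers are invented, and storing the hierarchy in "parent"/"children" attributes is a hypothetical encoding, since the text does not detail how Ditto-Extended-API persists the relations.

```python
# Illustrative Ditto Things for the factory example.
# Static data goes in "attributes", live sensor data in "features"; the
# "children"/"parent" attributes are a hypothetical encoding of the
# parent-child hierarchy maintained by Ditto-Extended-API.
factory = {
    "thingId": "org.example:factory",
    "policyId": "org.example:policy",
    "attributes": {"children": ["org.example:robot1",
                                "org.example:robot2",
                                "org.example:robot3"]},
}

sensor2 = {
    "thingId": "org.example:sensor2",
    "policyId": "org.example:policy",
    "attributes": {
        "parent": "org.example:robot1",
        "manufacturer": "ACME",          # static information
    },
    "features": {
        "temperature": {"properties": {"value": 21.7}},  # dynamic data
    },
}

def leaves(things, root):
    """Walk the parent-child tree and return the leaf (sensor) thingIds."""
    kids = things.get(root, {}).get("attributes", {}).get("children", [])
    if not kids:
        return [root]
    out = []
    for child in kids:
        out.extend(leaves(things, child))
    return out
```

A traversal such as leaves() is the kind of query the hierarchy makes cheap: asking a composite twin for all the sensors it ultimately aggregates.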
Figure 3: Example of a digital twin represented as a tree.
types, only the first option can be selected. The ation of devices in Eclipse Hono. All these ac-
interaction with the Ditto policy entity remains tions will be done through API calls, in order for
as it is, although a query is added that returns the plugin to have as minimal business logic as
a list of all existing policies. Figure 6 shows an possible.
overview of the calls currently provided by the To accomplish this, a tab on the main page will
API. be assigned to each of the entities: twins, types,
The service has been developed with the policies, and connections. This tab will modify
NodeJS framework, whose programming lan- its appearance depending on the parameters re-
guage is JavaScript. This open source framework ceived in the URL. Initially it will display a list
works at runtime and provides great perfor- of all the elements of the entity, although, in the
mance in the development of server-side tools. case of twins and types, it may be restricted to
After the implementation and testing, the ser- elements that do not belong to any other (no par-
vice has been containerized for Docker and de- ents) as they are considered the main entities.
ployed on Kubernetes. It should be noted that All elements may be selected for query, edit or
ES6 has not been used for the implementation delete. When querying a twin or type, the infor-
because it was incompatible with the version of mation of the element will be displayed, as well
Kubernetes used in our cluster. Finally, to use as lists that allow querying its parents and chil-
the service, the URI where Eclipse Ditto is de- dren. When consulting a twin, it will also be pos-
ployed, valid credentials to interact with it and sible to manage its relations with Eclipse Hono
the MongoDB URI for the Ditto policy database and Kafka-ML. For these relations, a valid con-
must be set as environment variables. nection with each technology must be selected.
In the connections section you can create Eclipse
4.3. Digital Twins app plugin for Grafana Hono tenants and connect them, or others al-
ready created, to Eclipse Ditto.
Grafana offers three types of plugins: data
source plugins, panel plugins and application The configuration section will be filled in only
plugins. In this case, the latter option is of in- with fields to enter the addresses and creden-
terest to us, as it will allow us to add a new sec- tials necessary to use the different technologies
tion in the left bar of the Grafana application, that make up the platform. If these fields are not
in which we will have a page with as many tabs filled in, the application will not be able to func-
as necessary. Within each one of them it is pos- tion.
sible to add any type of functionality, although In terms of implementation, Grafana provides
it must be taken into account that Grafana has a template for each type of plugin, which can
certain defined styles, so it is recommended to serve both as a basis and to give an idea of how
use them to maintain consistency.In addition to the plugin works. All plugins have a series of
this page, every plugin includes a configuration essential files so that Grafana can detect the plu-
section, which is also a tabbed page, where we gin and configure it. ReactJS, an open source,
can typically find the plugin readme, the option component-based JavaScript framework, is used
to enable/disable it and, if there are any, all the for coding. This framework’s main purpose is
configuration options necessary for the plugin to to create user interfaces for single page applica-
work. tions. Each tab can be assigned a ReactJS com-
This plugin should act as a front-end to the ponent. In our case this component will be used
platform, providing the user with an interface that brings together the different functionalities of the platform. Specifically, it must allow the management of digital twins and types through the Ditto-Extended-API service, the administration of policies and connections directly in Eclipse Ditto, and the connection with Kafka-ML through the two services developed.

A component selector is used to switch between the different modes of an entity: consulting its elements, consulting a single element, and editing an element. This selection is made by means of a parameter included in the URL. On the other hand, Grafana provides a series of libraries that allow a better integration with the tool. The use of the grafana/ui library is essential to maintain the consistency of the forms and other visual elements of the plugin with respect to Grafana.

Once the plugin has been built correctly, its code must be added to the Grafana plugins folder and activated in the corresponding section of the application. As soon as the required fields in the configuration have been filled in, the plugin is ready for use.

Figure 6: Available API calls in ditto-extended-api.

4.4. Error detection for Eclipse Hono with Kafka-ML

Eclipse Hono allows the collection of the data received from several devices in a single AMQP 1.0 output, where they can be consulted at the endpoint corresponding to the tenant to which each device belongs. This endpoint has the form telemetry/tenantid, where tenantid is the tenant identifier. The problem with Eclipse Hono is that its functionality does not include any way to verify that certain devices are sending data periodically. Its closest feature is the ttd field received as a device notification, which allows you to indicate whether a device is available indefinitely, unavailable, or available for a given number of seconds after receiving its last message. This is useful for checking that a device is available before sending a message to it, but it is not really relevant for our purposes, as it cannot be manually configured for the AMQP and MQTT protocols and, even if it could, the frequency at which each device will send its data is unknown.

The functionality sought is precisely the constant verification that each device is sending data periodically. Furthermore, if no data is received within the usual time interval, it will be assumed that the sensor has some kind of error, and predicted data will be produced by Machine Learning to cover the lack of information. This error detection cannot depend on the reception of any special message from the sensor, as the service must cope with all types of errors, including those that cause a loss of connection.

For this purpose, a service has been designed
that will launch a thread for each Eclipse Hono
tenant to be monitored. Within these tenants,
the devices indicated by the user will be super-
vised. In case no data is received from any of
them, the last values obtained will be sent to
Kafka-ML as well as the information required for
the prediction, following the format specified by
the user. The control of tenants and devices can be easily activated and deactivated without eliminating the rest of the stored data. A MongoDB
database will be used to store all the informa-
tion needed to establish the connections and pro-
duce the Kafka-ML entries. The exact content of
the information to be provided for each tenant is
shown in Figure 7.
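The per-device supervision described above can be sketched as follows. This is a minimal illustration, not the actual service code: the class and function names (DeviceMonitor, build_input) and the topic name are hypothetical, and the real service additionally persists its configuration in MongoDB and exposes a Flask API.

```python
import struct
import threading
from datetime import datetime

def build_input(required_values):
    """Build the float64 array for Kafka-ML from a required_values spec.

    Names prefixed with $ (e.g. $year) are taken from the current datetime;
    the rest use the last value received from the device.
    """
    now = datetime.now()
    out = []
    for spec in required_values:
        name = spec["name"]
        if name.startswith("$"):            # e.g. "$year" -> datetime.year
            out.append(float(getattr(now, name[1:])))
        else:                               # otherwise use the last value seen
            out.append(float(spec["last_value"]))
    # pack in order of appearance as float64 bytes for the Kafka-ML input topic
    return struct.pack(f"<{len(out)}d", *out)

class DeviceMonitor:
    """Re-arms a timer on every real reading; on expiry, asks for a prediction."""

    def __init__(self, required_values, interval, producer, topic):
        self.required_values = required_values
        self.interval = interval            # usual time between messages (s)
        self.producer = producer            # e.g. a kafka-python KafkaProducer
        self.topic = topic
        self.timer = None

    def on_message(self, values):
        """Called on every real reading: store the values and re-arm the timer."""
        plain = [s for s in self.required_values if not s["name"].startswith("$")]
        for spec, value in zip(plain, values):
            spec["last_value"] = value
        if self.timer:
            self.timer.cancel()
        self.timer = threading.Timer(self.interval, self.on_timeout)
        self.timer.start()

    def on_timeout(self):
        """No data within the usual interval: send the last values to Kafka-ML."""
        self.producer.send(self.topic, build_input(self.required_values))
```

A real deployment would pass a kafka-python producer and read the interval and required_values from the MongoDB-backed configuration described above.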
If the difference between the newly observed interval and the previous one is disproportionate and inconsistent, it shall not be taken into account, and the last set interval shall be maintained.

When sending the Kafka-ML message, the required_values array of the device is consulted, and an array is created with the specified elements in the indicated format, sorted in order of appearance. To indicate a time value, the name of the value must be defined as the name of the corresponding field in the datetime library, preceded by the symbol $. For example, to indicate the year, the value name must be $year. For the rest of the values, the last values received will be used, which must be stored in the same object. After construction, this array will be sent as bytes to the Kafka-ML tool. In this way, the values required for, for example, a sensor that has to send an array in float64 format with the layout [year, month, day, temperature, humidity] would correspond to the object in Figure 9.

"required_values": [
    { "format": "float64", "name": "$year" },
    { "format": "float64", "name": "$month" },
    { "format": "float64", "name": "$day" },
    { "format": "float64", "name": "temperature", "last_value": null },
    { "format": "float64", "name": "humidity", "last_value": null }
]

Figure 9: Example of the required_values field.

This service has been implemented using the Flask framework, whose programming language is Python, as it provides great facilities when creating APIs and, unlike NodeJS, allows multithreading. The main library is the Python threading library, which provides threads and timers. Of the remaining libraries used, the most relevant are those that handle the different connections: kafka-python for the Kafka producer, python-qpid-proton for the AMQP 1.0 receiver, and Flask-PyMongo for working with the MongoDB database. Also remarkable is the Marshmallow library, which checks that the data sent by the user complies with the specified schema before it is inserted into the database. Once the implementation of the service was finished and the relevant tests had passed, it was containerized for Docker and deployed in Kubernetes.

4.5. Eclipse Hono to Kafka-ML

The aim of the Eclipse Hono to Kafka-ML service is to automate the input of sensor data for other Machine Learning models deployed in Kafka-ML, such as those capable of predicting future states of the twin or features of the twin that cannot be measured.

The design and implementation of this service are practically identical to those of the Error detection for Eclipse Hono with Kafka-ML service described in the previous section, so its explanation will be omitted. The design and construction of the API, the management of the threads and connections, and the mapping of the message received from Eclipse Hono to Kafka-ML are the same as described above. This service could be considered a simplification of the previous one: the difference lies in that it sends a message to the Kafka-ML input topic after each message received from an active device, regardless of the time interval between them. For this reason, the ditto_message field does not store the last received values of the properties, as all messages received from the indicated devices are mapped and sent directly to the corresponding Kafka-ML input topic.

4.6. Kafka-ML to Eclipse Ditto

Eclipse Ditto supports receiving messages through various types of connections. At the
time of the development of this platform, the
connection with Kafka was under development,
leaving available the creation of connections
with AMQP, MQTT and HTTP. Concerning the
messages, Eclipse Ditto requires them to be in
Ditto Protocol format in order to understand
what action to execute and on which twin to
perform it. In case the format in which the
connection producer sends messages cannot be
changed, a payload mapper can be applied to
the connection. Eclipse Ditto integrates sev-
eral types of mappers, although the most flexible is the JavaScript mapper.

Figure 10: Endpoints of Kafka-ML to Eclipse Ditto.

On the other hand,
Kafka-ML publishes in the output topic the pre-
dicted values in array format and currently does
not provide any way to set a specific format.
In our case, it is necessary to receive the data
predicted by Kafka-ML so that the state of the
corresponding twin is completely or partially up-
dated. Since the direct connection of Eclipse
Ditto to the Kafka-ML output topic is not possi-
ble, a service is needed that collects the informa-
tion from that topic, transforms it to Ditto Protocol following a given format, and sends it through one of the connections provided by Eclipse Ditto.

Figure 11: Entities of Kafka-ML to Eclipse Ditto.
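The core of such a service, namely reading a prediction array from the Kafka-ML output topic, filling the braces placeholders of a Ditto Protocol schema, and publishing the result, can be sketched as follows. This is illustrative only: the function name fill_schema is hypothetical, and the surrounding Kafka consumer and pika connection setup are omitted.

```python
import copy
import json

def fill_schema(schema, values):
    """Return a copy of the schema with "{i}" placeholders substituted.

    "{0}" takes the first value of the prediction array, "{1}" the second,
    and so on, following the ditto_message convention used by the service.
    """
    def substitute(node):
        if isinstance(node, dict):
            return {k: substitute(v) for k, v in node.items()}
        if isinstance(node, list):
            return [substitute(v) for v in node]
        if isinstance(node, str) and node.startswith("{") and node.endswith("}"):
            return values[int(node[1:-1])]   # "{0}" -> values[0]
        return node
    return substitute(copy.deepcopy(schema))

# Example with a ditto_message-style schema for a temperature/humidity twin.
# Publishing the JSON result to a RabbitMQ queue consumed by Eclipse Ditto
# would then be a single pika basic_publish call (connection setup omitted).
schema = {
    "topic": "test/DHT22/things/twin/commands/modify",
    "path": "/features",
    "value": {
        "temperature": {"properties": {"value": "{0}"}},
        "humidity": {"properties": {"value": "{1}"}},
    },
}
message = json.dumps(fill_schema(schema, [23.4, 51.2]))
```

Each monitored output topic would run this transformation in its own thread, as described below for the Kafka-ML-to-Eclipse-Ditto service.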
This service can be replaced in the future by direct connections to the Kafka-ML output topics that include their respective JavaScript payload mappers.

The service, called Kafka-ML-to-Eclipse-Ditto, has a similar design to the one explained in the previous section. In general, it consists of creating a thread for each Kafka-ML output topic from which we want to extract information, transforming that information, and sending the result to an AMQP 0.9.1 broker. These threads can be managed through an API which, as shown in Figure 10, allows them to be created, queried, deleted, activated, and deactivated. The IP of the AMQP broker is the same for all threads and is set in the service environment variables. Each thread requires for its operation the information indicated in Figure 11, which consists of the data necessary for the connections with Kafka and AMQP, the initial state of the thread, and the schema of the message that will be sent to Eclipse Ditto. This schema is a JSON object in Ditto Protocol format, in which the user has to mark in the value section the places where the values received from Kafka-ML should go. To do so, the number corresponding to the position of the value in the array received from Kafka-ML is indicated between braces. For example, if we have a digital twin of a temperature and humidity sensor, and Kafka-ML sends us an array whose first value is the predicted temperature and whose second value is the predicted humidity, the ditto_message field would be as shown in Figure 12.

"ditto_message": {
    "topic": "test/DHT22/things/twin/commands/modify",
    "path": "/features",
    "value": {
        "temperature": {
            "properties": { "value": "{0}" }
        },
        "humidity": {
            "properties": { "value": "{1}" }
        }
    }
}

Figure 12: Example of the ditto_message field.

As for the threads, each of them creates a Kafka consumer for the topic to which it corresponds and an AMQP connection to a RabbitMQ broker. Once initialized, the thread constantly checks the topic for new messages until it receives an event that forces it to stop and close the connections. After receiving a message, the placeholders in a copy of the given schema are replaced by the received values, and the resulting message is sent via AMQP to the queue indicated by the user.

As with the design, the implementation is quite similar to that of the previous service. Flask, Flask-PyMongo, threading, and Marshmallow are also used for the same purposes as those described in the implementation of the Error-detection-for-Hono-with-Kafka-ML service. On the other hand, the kafka-python library is used in this case to create a Kafka consumer capable of reading the Kafka-ML output topic, and the pika library is used to establish the AMQP 0.9.1 connection. It should be noted that, for the deployment of the RabbitMQ instance in Kubernetes, its Helm chart was used, it being necessary to activate persistence, assign a previously created persistent volume claim, activate the volume permissions, and set the service type to NodePort.

Once implemented, tested, containerized, and deployed, the service can be used. In order for Eclipse Ditto to be able to receive the generated messages, an AMQP 0.9.1 connection has been established between it and RabbitMQ, filling the source address part of the connection with the names of the AMQP queues to be consumed. In addition, a new subject has been added to the policy that governs the twin whose data we predict, and this subject has been included in the authorization context of the connection. To create the necessary credentials for the connections in RabbitMQ, the interface provided by the tool itself has been used. Moreover, in order to differentiate the values coming from Eclipse Hono from those coming from Kafka-ML, the Telegraf configuration has been updated to include the ditto-originator field of the message header as a tag key. After configuring this, creating the necessary entity, and activating the thread, the updates for the twin should be received without any problem.

4.7. Unity panel plugin for Grafana

As previously explained, Grafana acts as the front-end of the platform. Its main functionality is to create dashboards composed of panels that represent, in a certain way, the result of a query to a data source. Most of the default dashboards available in the tool focus on the representation of the data by employing some kind of graph. For the creation of custom dashboards, Grafana offers support for the development of panel plugins. This type of plugin is similar to the app plugin developed in a previous section, with the difference that it receives the data resulting from the query entered, and the panel options selected by the user, in order to build the representation.

The goal here is the 3D representation of the digital twin and its state. It is also desired to be able to interact with the representation in such a way that this interaction influences the data displayed on the Grafana dashboard and, likewise, that the representation is affected by the state of the twin. At the moment, Grafana does not have any panel plugin that provides this functionality. Therefore, to satisfy this requirement, a panel plugin will be created to display a 3D model developed in Unity and interact with it. Although its development will focus on its use in the area of digital twins, the aim is to abstract its functionality as much as possible to allow its reusability in other areas.

The implementation of the plugin is based on the React Unity WebGL library, which allows embedding Unity builds exported to WebGL format in any application developed on the React framework. Likewise, it allows establishing communication between the Unity model and the React application in both directions. The library provides a UnityContext object that is initialized with the four files that constitute the WebGL build. To access these files, they must be placed in the Grafana public folder. To display the build, it is only necessary to reference this object in a JSX element that is also provided by the library. For the Unity model to work correctly, it is necessary to disable, in one of its scripts, the capture of all keyboard inputs, since otherwise it may interfere with the operation of the rest of the JavaScript elements that compose the application.

For receiving information in the Unity model coming from the React application, a function must first be defined in each of the Unity objects that is to receive such information. This function will receive the data sent by the application as a parameter, typically as text or a number. On the application side, the send method of the previously created UnityContext object must be used, indicating the name of the Unity object we are referring to, the name of the function mentioned above and, if any, the data to be sent. In our case, we want to send to the corresponding Unity objects the data obtained as a result of the query that the user has entered in the Grafana panel. This information is contained in the data variable that Grafana provides as a parameter of the component. This variable contains a list of series, which will depend on how the data are grouped: there shall be one series for each group in the query result. For example, if you group the data first by thingId and then by feature, and you have two twins with the same two features, then you will create 4 groups and therefore 4 series. Each series will contain a set of fields, and each field will have a list with its values. The values of the last grouping column will be considered as independent fields, while the values of the other grouping columns will be shown as labels of those fields. Thus, in the example above, each of the series would have a time field and a field with the name of the feature by which it is grouped, both labelled with the thingId to which they correspond. Taking all this into account, the panel plugin will extract the relevant data from this Grafana variable, analyse it, and build the message for the device that is being filtered. Once sent, it will be received as a parameter in the corresponding Unity object function, which will act accordingly. This sending is done after each modification of the data variable, that is, after each reception of data in Grafana.

To send data from the Unity model to the React application, it is necessary to establish an event beforehand. In a specific folder inside the Unity model, a JSLib file has to be added where the name of the event and the types of the parameters to be sent are defined. This event can be imported and used within Unity scripts that require sending information to the React application. To receive the data in the panel plugin, the corresponding method of the UnityContext object must be called, indicating the name of the event to wait for and the function to execute after receiving it. This function must have the parameters specified in the event. In our case, a script has been defined in Unity that, when a Unity object is clicked on, sends an event with its identifier. The identifiers of these objects must correspond to those used for the construction of the twin in Eclipse Ditto. This identifier will be set as the value of a variable of the Grafana dashboard where the panel is located. Grafana allows the definition of several types of variables that can be included in queries to make the dashboards more dynamic and interactive. In this way, the rest of the dashboard panels will be able to display information depending on the Unity object that is clicked in the 3D model.

All variable elements of the panel, such as the function name, the event name, the WebGL build location, and the dashboard variable name, have been defined as panel options to allow the user to easily modify these values.

As with the app plugin developed in a previous section, the developed code must be added to the Grafana plugins folder to be used. Once activated, it can be included in any Grafana dashboard, where it can be configured and adapted to the user's preferences.

5. Use case: Virtual analyser in the Petrochemical industry

The digital twin platform presented here will be validated through a Petrochemical Industry 4.0 use case. The objective of the use case is to define a virtual analyzer that is able to predict
the freezing point of one of CEPSA's end products (a lubricant) based on the operating conditions and the properties of the feedstock. This process is carried out at the San Roque (Spain) Energy Park of CEPSA, one of the largest refineries in Spain. The freezing point is an important parameter that, due to its characteristics, must be measured in a laboratory. Based on the monitoring of different operating conditions, such as filters, the aim is to predict in Kafka-ML the state of the freezing point in real time for better control of the process. This continuous prediction, together with the status of the monitored sensors, will be modelled as a digital twin within our framework. Digital twins have also been studied before in the Petrochemical Industry. For instance, in [17], a digital twin for production control purposes of a catalytic cracking unit in the Petrochemical Industry is proposed. In this work, we go further by considering the modelling, prediction, and 3D visualization of a process in this industry.

The company has provided us with real-time data (through the MQTT protocol) and historical data (to train ML models) from the necessary sensors, which will be considered twins in their own right and together will constitute the main twin being sought. From them, the different digital twin types have been identified by grouping the sensors that are identical in operation and description, and a Ditto Thing scheme has been defined for each type and twin. In this case, all the sensors receive a single value along with the time it was taken and, likewise, the format in which the data from each sensor is received is identical. In order to facilitate mapping and data consultation, the features section is the same for all twins and types. Using the Digital Twins plugin for Grafana, connected to a Ditto-Extended-API service, the different twins have been created. All elements have been grouped in the same Eclipse Ditto namespace (a logical way to group information and digital twins). As an example, the Ditto Thing scheme for one of the sensors is shown in Figure 13.

{
    "thingId": "cepsa:LSRC3002.PF",
    "policyId": "cepsa:basic_policy",
    "attributes": {
        "name": "LSRC3002.PF",
        "description": "Unit load",
        "units": "m3/d"
    },
    "features": {
        "last_measured": {
            "properties": {
                "value": null,
                "time": null
            }
        }
    }
}

Figure 13: Example of the Eclipse Ditto schematic for one of the sensors in the use case.

For sending real-time data, a tenant was created in Eclipse Hono for the MQTT connection, within which credentialed devices have been added for each of the available sensors. The name of each device coincides with the thingId of its corresponding twin, thus facilitating the connection with Eclipse Ditto. Moreover, in this same connection, a JavaScript mapper has been applied to convert the messages received into the Ditto Protocol format.

At this point, the twins should be receiving the data correctly, and their status with respect to time will be stored in InfluxDB, so Grafana dashboards can be created according to requirements.

The freezing point predictive model developed as the target of the use case has been deployed in Kafka-ML. For its data input, a simple script has been written that periodically makes a call to Eclipse Ditto to collect the current state of the twins that represent the sensors required by the model. The output data of this model is collected by the Kafka-ML-to-Eclipse-Ditto service in order to update the freezing point feature contained in the digital twin.

For the 3D representation of the twin's state, a model was created in Unity that contains enough elements to represent each of the sensors that are part of the machine. In this model, each element has been renamed with the ID of the sensor it represents, and the necessary code has been implemented for the movement of the camera and the selection of elements by clicking on them, as well as the script necessary for the model's correct working in Grafana. Its WebGL export has been added to the Grafana public folder, and the Unity panel has been included in one of the boards.

Figure 14 shows the final result of the 3D representation of the digital twin developed for the described use case. As a result, an easily adaptable and extendable twin of the industrial process has been obtained, with an eye-pleasing representation of its real-time status, using different types of graphics as well as a 3D model that, on receiving the data from the sensors, provides the possibility of displaying data of interest on the machine, such as its real movement, and allows any type of interaction with the user. Likewise, the platform allows easy querying of the current state of the twin, via the Eclipse Ditto API, and of its state over time, using any of the query options provided by InfluxDB. Furthermore, in terms of predictive concerns, machine learning models are easily integrated with the twins, with the resulting value being considered as any other feature of the twin.

6. Evaluation

6.1. Experimental setup

Hardware configuration. All the experiments were performed on a five-node Kubernetes cluster in our private cloud infrastructure in VMware vCloud. Each node has 4 virtual CPUs in 2 sockets and 16 GB of RAM. The client that sent the information, and from which the results were measured, was a PC with 64 GB of RAM and 1 CPU with 10 cores.

Software configuration. Each of the five nodes runs Kubernetes v1.19.3 and Docker 19.03.13 on top of Ubuntu 16.04.7 LTS. A Kubernetes master was deployed in one node, whereas the remaining four are Kubernetes workers. The PC with the client runs Ubuntu Server.

6.2. Test 1 - Essential functionality flow

The test will evaluate the latency and throughput from the moment data are sent to Eclipse Hono via MQTT to the moment they are stored in InfluxDB, within the dataflow used in the Petrochemical Industry use case described. Here we have two test cases: a different number of sensors receiving data simultaneously, and a different number of clients/connections sending data simultaneously. The size of the data sent has not been taken into account, because all the messages share the same format, so their size hardly varies.
Figure 14: 3D visualization of the digital twin for the use case
In a real scenario, not all the sensors of the platform will receive data simultaneously or with the same frequency, so we can determine that the platform performs well as the number of sensors (here, twins) affected increases.

6.2.2. Related to number of clients

This case is similar to the previous one, with the difference that data updates will only be sent to a single sensor by a varying number of simulated clients using threads. The values sent in each message are always unique, as each client increments a global value by 0.01, starting at 0. Likewise, the result of each test per number of clients is the average of ten repetitions of this test. Figure 16 shows the results, which have also been limited to 27 clients to facilitate the understanding of the graph and the comparison with the one explained above.

As can be seen, the latency exceeds one second of delay after 20 simultaneous clients. The throughput, on the other hand, maintains a fair decrease. These results do not represent a problem, since the most common situation is that a twin or device receives data from a single data source, with the exception of simulated or predicted features, where the number of clients could usually increase by one or two. This behaviour is also normal, as sending all messages to a single twin can overload it. Therefore, we can conclude that the system reacts correctly to a coherent increase in the number of clients.

6.3. Test 2 - Machine learning prediction flow

In this test, the latency and throughput will be calculated for the machine learning integration part of the architecture. Since there is only one prediction model running in the use case, for the freezing point prediction, the test will always affect the same Eclipse Ditto twin, and what we will vary is the number of clients sending input data to the model. To relate each client to the outcome of the model, we have used data inputs with known results, avoiding their repetition during each of the tests. After the execution of the test with a certain number of clients, InfluxDB is consulted for the time at which each data item was stored, which allows later comparison. The result of each test is the average of 10 executions. Figure 17 shows the results achieved in this test.

The results are very similar to those shown in the first test with respect to the number of clients. Latency grows until it exceeds one second of delay with 17 simultaneous clients. Similarly, the throughput decreases.
Figure 15: Latency (a) and throughput (b) of test 1 for different number of sensors.

Figure 16: Latency and throughput of test 1 for different number of clients.
These are acceptable results compared to the real-time data flow, as the difference between them is minor, and it must be taken into account that a per-client prediction is made during the process. Moreover, in this case, as in the previous one, it is usual for a single client to initiate the flow, so the limitation on simultaneous clients would not be a problem. Therefore, we can determine that the platform has a very good response to the increase in the number of simultaneous clients during the prediction flow through machine learning.

6.4. Test 3 - Error tolerance

This last test is based on checking the platform's tolerance to errors. To do this, a script has been created that sends a message to Eclipse Hono via MQTT every half a second. While this script is running, the pod corresponding to the service to be tested is manually deleted in Kubernetes. If a message could not be sent, it will be resent as many times as necessary, although maintaining the initial time of the first sending. When the end of the test is indicated, the data received will be extracted from InfluxDB for comparison. The maximum time difference of the test shall be considered as the recovery time for that pod. Each test result is the average of 5 test runs. In Figure 18, we can see the results. The selected pods follow the sequence from the sending of a data item to Eclipse Hono until the moment the corresponding twin is updated in Eclipse Ditto, and they are shown in the graph in that order. The combination of two or more pods has not been included, as the result coincides with the maximum recovery time between them.
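The resending behaviour described, retrying a failed message while keeping the timestamp of the first attempt so that the measured recovery time covers the whole outage, can be sketched as follows (a hypothetical helper, not the actual test script):

```python
import time

def send_with_retry(send, payload, delay=0.5, max_attempts=None):
    """Retry send(payload, first_attempt_time) until it succeeds.

    The message always carries the time of the FIRST attempt, so the later
    comparison against the storage time in InfluxDB yields the full
    recovery time of the deleted pod, not just the last successful send.
    """
    first_attempt = time.time()
    attempts = 0
    while True:
        attempts += 1
        try:
            send(payload, first_attempt)     # timestamp of the first try
            return attempts
        except ConnectionError:
            if max_attempts is not None and attempts >= max_attempts:
                raise
            time.sleep(delay)                # wait before resending
```

In the real test, send would be the MQTT publish to Eclipse Hono, and the half-second delay matches the script's sending period.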
alize the status of the plant and its components (e.g. filters) directly on the 3D representation, as well as the freezing point prediction in real time.

As future work for the platform, we envisage supporting the FMI (Functional Mock-up Interface) standard in order to be able to simulate complex processes that follow this standard and integrate them with the 3D representations of the platform. Furthermore, we intend to generate hybrid twins that can take advantage of the low-latency opportunities offered by edge/fog systems, and to demonstrate the viability of the platform in other contexts (we are currently working in the agricultural sector).

References

[1] M. Díaz, C. Martín, B. Rubio, State-of-the-art, challenges, and open issues in the integration of internet of things and cloud computing, Journal of Network and Computer Applications 67 (2016) 99–117.
[2] D. De Silva, S. Sierla, D. Alahakoon, E. Osipov, X. Yu, V. Vyatkin, Toward intelligent industrial informatics: A review of current developments and future directions of artificial intelligence in industrial applications, IEEE Industrial Electronics Magazine 14 (2) (2020) 57–72.
[3] F. Tao, H. Zhang, A. Liu, A. Y. Nee, Digital twin in industry: State-of-the-art, IEEE Transactions on Industrial Informatics 15 (4) (2018) 2405–2415.
[4] A. Rasheed, O. San, T. Kvamsdal, Digital twin: Values, challenges and enablers from a modeling perspective, IEEE Access 8 (2020) 21980–22012.
[5] A. C. Márquez, A. de la Fuente Carmona, J. A. Marcos, J. Navarro, Designing CBM plans, based on predictive analytics and big data tools, for train wheel bearings, Computers in Industry 122 (2020) 103292.
[6] A. A. Nazarenko, L. M. Camarinha-Matos, The role of digital twins in collaborative cyber-physical systems, in: Doctoral Conference on Computing, Electrical and Industrial Systems, Springer, 2020, pp. 191–205.
[7] J. Conde, A. Munoz-Arcentales, A. Alonso, S. Lopez-Pernas, J. Salvachua, Modeling digital twin data and architecture: A building guide with FIWARE as enabling technology, IEEE Internet Computing (2021).
[8] Y. Zheng, S. Yang, H. Cheng, An application framework of digital twin and its case study, Journal of Ambient Intelligence and Humanized Computing 10 (3) (2019) 1141–1153.
[9] J. Cheng, H. Zhang, F. Tao, C.-F. Juang, DT-II: Digital twin enhanced industrial internet reference framework towards smart manufacturing, Robotics and Computer-Integrated Manufacturing 62 (2020) 101881.
[10] Y. Mo, S. Ma, H. Gong, Z. Chen, J. Zhang, D. Tao, Terra: A smart and sensible digital twin framework for robust robot deployment in challenging environments, IEEE Internet of Things Journal 8 (18) (2021) 14039–14050.
[11] H. V. Dang, M. Tatipamula, H. X. Nguyen, Cloud-based digital twinning for structural health monitoring using deep learning, IEEE Transactions on Industrial Informatics (2021).
[12] V. Kamath, J. Morgan, M. I. Ali, Industrial IoT and digital twins for a smart factory: An open source toolkit for application design and benchmarking, in: 2020 Global Internet of Things Summit (GIoTS), June 3, Online, IEEE, 2020, pp. 1–6.
[13] K. Shah, T. Prabhakar, C. Sarweshkumar, S. Abhishek, et al., Construction of a digital twin framework using free and open-source software programs, IEEE Internet Computing (2021).
[14] R. P. Rolle, V. d. O. Martucci, E. P. Godoy, Modular framework for digital twins: Development and performance analysis, Journal of Control, Automation and Electrical Systems 32 (6) (2021) 1485–1497.
[15] A. Khan, F. Shahid, C. Maple, A. Ahmad, G. Jeon, Toward smart manufacturing using spiral digital twin framework and twinchain, IEEE Transactions on Industrial Informatics 18 (2) (2020) 1359–1366.
[16] C. Martín, P. Langendoerfer, P. S. Zarrin, M. Díaz, B. Rubio, Kafka-ML: Connecting the data stream with ML/AI frameworks, Future Generation Computer Systems 126 (2022) 15–33.
[17] Q. Min, Y. Lu, Z. Liu, C. Su, B. Wang, Machine learning based digital twin framework for production optimization in petrochemical industry, International Journal of Information Management 49 (2019) 502–519.

Julia Robles graduated in Software Engineering from the University of Málaga, Spain, in 2021. Since then, she has been part of the ERTIS Research Group at the University of Málaga as a research assistant and is a member of the ITIS Software Institute at the University of Málaga. Her research interests are mainly in the area of Digital Twins, the Internet of Things, and Artificial Intelligence.
Cristian Martín received an MS
in Computer Engineering, an MS
in Software Engineering and Ar-
tificial Intelligence, and a PhD in
Computer Science from the Uni-
versity of Málaga, Spain, in 2014, 2015, and
2018, respectively. Currently, he is a postdoctoral researcher at the University of Málaga. Previously, he worked as a software engineer at various tech companies on RFID technology and software development. He is also a member
of the ITIS Software Institute of the University
of Málaga. His research interests focus on the
integration of the Internet of Things with Cloud/-
Fog/Edge Computing, Machine Learning, Struc-
tural Health Monitoring, and IoT Reliability.