The Internet of Robotic Things: a review of concept, added value and applications (SAGE, 2018)
Abstract
The Internet of Robotic Things is an emerging vision that brings together pervasive sensors and objects with robotic and
autonomous systems. This survey examines how the merger of robotic and Internet of Things technologies will advance
the abilities of both the current Internet of Things and the current robotic systems, thus enabling the creation of new,
potentially disruptive services. We discuss some of the new technological challenges created by this merger and conclude
that a truly holistic view is needed but currently lacking.
Keywords
Internet of Things, cyber-physical systems, distributed robotics, network robot systems, autonomous systems, robot
ecology
Creative Commons CC BY: This article is distributed under the terms of the Creative Commons Attribution 4.0 License
(http://www.creativecommons.org/licenses/by/4.0/) which permits any use, reproduction and distribution of the work without
further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/
open-access-at-sage).
2 International Journal of Advanced Robotic Systems
a ‘robot-aided IoT’ view where robots are just additional sensors.20,21
Cloud computing and the IoT are two non-robotic enablers in creating distributed robotic systems (see Figure 1). IoT technologies have three tenets22: (i) sensors proliferated in the environment and on our bodies; (ii) smart connected objects using machine-to-machine (M2M) communication; and (iii) data analytics and semantic technologies transforming raw sensor data. Cloud computing provides on-demand, networked access to a pool of virtualized hardware resources (processing, storage) or higher-level services. Cloud infrastructure has been used by the IoT community to deploy scalable IoT platform services that govern access to (raw, processed or fused) sensor data. Processing the data streams generated by billions of IoT devices in a handful of centralized data centres raises concerns about response-time latency, massive ingress bandwidth needs and data privacy. Edge computing (also referred to as fog computing or cloudlets) brings on-demand, elastic computational resources to the edge of the network, closer to the producers of data.23 The cloud paradigm was also adopted by the robotics community under the name cloud robotics,9–12 for offloading resource-intensive tasks,13,14 for sharing data and knowledge between robots24 and for reconfiguring robots following an app-store model.25 Although cloud robotics and the IoRT overlap, the former paradigm is more oriented towards providing network-accessible infrastructure for computational power and for the storage of data and knowledge, while the latter is more focused on M2M communication and intelligent data processing. The focus of this survey is on the latter: we discuss the potential added value of the IoT–robotics crossover in terms of improved system abilities, as well as the new technological challenges the crossover poses.
As one of the goals of this survey is to inspire researchers on the potential of introducing IoT technologies into robotic systems and vice versa, we structure our discussion along the system abilities commonly found in robotic systems, regardless of specific robot embodiment or application domain. Finding a suitable taxonomy of abilities is a delicate task. In this work, we build upon an existing community effort and adopt the taxonomy of nine system abilities defined in the euRobotics roadmap,26 which shapes the robotic research agenda of the European Commission. Interestingly, these abilities are closely related to the research challenges identified in the US Robotics roadmap27 (see Figure 2).
Basic abilities
Perception ability
The sensor and data analytics technologies of the IoT can clearly give robots a wider horizon than local, on-board sensing alone, in terms of the space, time and type of information covered. Conversely, placing sensors on board mobile robots allows them to be positioned in a flexible and dynamic way and enables sophisticated active sensing strategies.
A key challenge of perception in an IoRT environment is that the environmental observations of the IoRT entities are spatially and temporally distributed.28 Techniques must be put in place to allow robots to query these distributed data. Dietrich et al.29 propose local databases, one in each entity, where data are organized in a spatial hierarchy: for example, an object has a position relative to a robot, the robot is positioned in a room, and so on. Other authors30,31 propose that robots send specific observation requests to the distributed entities, for example, a region and objects of interest; this may speed up otherwise intractable sensor processing problems (see Figure 3).
A key component of robots’ perception ability is knowledge of their own location, which includes the ability to build or update models of the environment.32 Despite great progress in this domain, self-localization may still be challenging in crowded and/or Global Positioning System (GPS)-denied indoor environments, especially if high reliability is demanded. Simple IoT-based infrastructures such as a radio frequency identification (RFID)-enhanced floor have been used to provide reliable location information to domestic robots.33 Other approaches use range-based techniques on signals emitted by off-board transmitters.34–37
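The spatial hierarchy proposed by Dietrich et al.29 can be pictured with a minimal sketch: each entity stores only its position relative to a parent frame, and a query resolves a global position by composing the offsets up to the root. All entity names and coordinates below are invented for illustration, not taken from their system.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Entity:
    """A node in the spatial hierarchy; (x, y) is relative to the parent frame."""
    name: str
    x: int
    y: int
    parent: Optional["Entity"] = None

def global_position(e: Entity) -> tuple:
    """Resolve a position by composing relative offsets up to the root frame."""
    x, y = e.x, e.y
    while e.parent is not None:
        e = e.parent
        x, y = x + e.x, y + e.y
    return (x, y)

# A cup is known relative to the robot, the robot relative to the kitchen,
# and the kitchen relative to the apartment (the root frame).
apartment = Entity("apartment", 0, 0)
kitchen = Entity("kitchen", 4, 1, parent=apartment)
robot = Entity("robot", 2, 1, parent=kitchen)
cup = Entity("cup", 1, 0, parent=robot)

print(global_position(cup))  # (7, 2)
```

Because only relative positions are stored, updating the robot's pose in the kitchen automatically "moves" every observation expressed relative to the robot.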
Simoens et al. 3
Manipulation ability
While the core motivation of the IoT is to sense the environment, that of robotics is to modify it. Robots can grasp, lift, hold and move objects via their end effectors. Once the robot has acquired the relevant features of an object, such as its position and contours, the joint motions required to reach and grasp it can be calculated via inverse kinematics.
The added value of IoT is in the acquisition of the
object’s features, including those that are not observable
with the robot’s sensors but have an impact on the grasping
procedure, such as the distribution of mass, for example, in
a filled versus an empty cup. Some researchers attached
RFID tags to objects that contain information about their
size, shape and grasping points.5 Deyle et al.46 embedded RFID reader antennas in the finger of a gripper: differences in the signal strength across the antennas were used to more accurately position the hand before touching the object. Longer-range RFID tags were used to locate objects in a kitchen47 or in smart factories,48,49 as well as to locate the robots themselves.50
Figure 3. Distributed cameras assist the robot in locating a charging station in an environment. The charging station was placed between a green and a yellow visual marker (location A). Visual markers of the same colours were placed elsewhere in the environment to simulate distractors. Visual processing is performed on demand on the camera nodes to inform the robot that the charging station is at location A and not at the distracting location B (Image from Chamberlain et al.30) (c) 2016 IEEE.
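The idea of tag-borne grasp metadata can be sketched as follows. The payload schema, field names and parameter choices are purely illustrative and not taken from the cited systems; a real deployment would define its own tag format.

```python
import json

# Hypothetical payload stored on an object's RFID tag (illustrative schema).
TAG_PAYLOAD = json.dumps({
    "object": "mug",
    "width_mm": 82,
    "shape": "cylinder",
    "grasp_point_mm": [0, 0, 45],  # offset from the tag, in the tag's frame
    "filled": True,                # affects mass distribution, invisible to cameras
})

def plan_grasp(payload: str) -> dict:
    """Turn tag metadata into grasp parameters for the controller."""
    info = json.loads(payload)
    # Open the gripper slightly wider than the object, and slow down for
    # objects whose mass distribution may shift (e.g. a filled cup).
    return {
        "gripper_opening_mm": info["width_mm"] + 20,
        "approach_offset_mm": info["grasp_point_mm"],
        "max_speed_scale": 0.5 if info.get("filled") else 1.0,
    }

print(plan_grasp(TAG_PAYLOAD))
# {'gripper_opening_mm': 102, 'approach_offset_mm': [0, 0, 45], 'max_speed_scale': 0.5}
```

The point of the sketch is that properties such as "filled" are exactly the features the survey mentions: relevant to grasping, yet unobservable by the robot's own sensors.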
Figure 4. The vacuum cleaning robot adapts its plan to avoid interference in the kitchen (Figure from Cirillo et al.56) (c) 2010 ACM.
Figure 5. Depending on the state of the environment, a natural language instruction results in different actions to be performed (Image from Misra et al.62) (c) 2016 SAGE.
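The behaviour illustrated in Figure 5, where the same instruction grounds to different action sequences depending on the environment state, can be sketched with a toy example. The state keys and action names are invented; real grounding systems such as Misra et al.'s learn this mapping rather than hard-coding it.

```python
def ground_instruction(instruction: str, state: dict) -> list:
    """Map a vague instruction to concrete actions, using IoT-provided state."""
    if instruction == "heat up the food":
        actions = []
        # Only open the microwave if a door sensor reports it is closed.
        if not state["microwave_door_open"]:
            actions.append("open(microwave)")
        actions += ["put(food, microwave)", "close(microwave)", "start(microwave)"]
        return actions
    raise ValueError("unknown instruction")

# The same instruction yields different plans in different world states.
print(ground_instruction("heat up the food", {"microwave_door_open": False}))
# ['open(microwave)', 'put(food, microwave)', 'close(microwave)', 'start(microwave)']
print(ground_instruction("heat up the food", {"microwave_door_open": True}))
# ['put(food, microwave)', 'close(microwave)', 'start(microwave)']
```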
IoT technologies can facilitate human–robot interaction at the functional (commanding and programming) and social levels, as well as serving as a means for tele-interaction.
Functional. Pervasive IoT sensors can make the functional means of human–robot interaction more robust. Natural language instructions are a desirable way to instruct robots, especially for non-expert users, but they are often vague or contain implicit assumptions.61,62 The IoT can provide information on the position and state of objects to disambiguate these instructions (see Figure 5). Gestures are another intuitive way to command robots, for instance, by pointing to objects. Recognition of pointing gestures from sensors on board the robot only works within a limited field of view.63 External cameras provide a broader scene perspective that can improve gesture recognition.64 Wearable sensors have also been used: for example, Wolf et al.65 demonstrated a sleeve that measures forearm muscle movements to command robot motion and manipulation.
Social. Body cues like gestures, voice or facial expression can be used to estimate the user's emotional state66 and make the robot respond to it.67,68 Integration with body-worn IoT sensors can improve this estimate by measuring physiological signals: Leite et al.69 measured heart rate and skin conductance to estimate engagement, motivation and attention during human–robot interaction. Others have used these estimates to adapt the robot's interaction strategy, for example, in the context of autism therapy70 or stress relief.71
Tele-interaction. Robots have also been used alongside IoT technologies for remote interaction, especially in healthcare. Chen et al.72 communicate hugs and manipulations between persons via sensorized robots. Al-Taee et al.73 use robots to improve the tele-monitoring of diabetes patients by reading out the glucose sensor and vocalizing the feedback from the carer (see Figure 6). Finally, in the GiraffPlus project, a tele-presence robot was combined with environmental sensors to provide health-related data to a remote therapist.
Cognitive ability
By reasoning on and inferring knowledge from experience, cognitive robots are able to understand the relationship
Figure 7. The Ubiquitous Network Robot Platform is a two-layered platform. The LPF configures a robotic system in a single area; the GPF is a middle layer between the LPFs of different areas and the service applications (Image from Nishio et al.86). LPF: local platform; GPF: global platform (c) 2013 Springer-Verlag.
will now discuss relevant application domains and supporting platforms.
Mobile robots are used in precision agriculture for the deployment of herbicide, fertilizer or irrigation.88 These robots need to adapt to spatio-temporal variations of crop and field patterns, crop sizes, light and weather conditions, soil quality, and so on.89 Wireless sensor networks (WSNs) can provide the necessary information:90,91 for example, knowledge of soil moisture may be used to ensure accurate path tracking.92,93 Gealy et al.94 use a robot to adjust the drip rate of individual water emitters to allow for plant-level control of irrigation; this is a notable example of how robots are used to adjust IoT devices.
Some platforms supporting adaptation of the IoRT have also been showcased in the context of Ambient Assisted Living (AAL). Building on OSGi, a platform for IoT home automation, AIOLOS exposes robots and IoT devices as reusable and shareable services, and automatically optimizes the runtime deployment across distributed infrastructure, for example, by placing a shared data processing service closer to the source sensor.95,96 Bacciu et al.97,98 deploy recurrent neural networks on distributed infrastructure to automatically learn user preferences and to detect disruptive environmental changes such as the addition of a mirror.99
Dependability
Dependability is a multifaceted attribute, covering the reliability of hardware and software robotic components, safety guarantees when cooperating with humans and the degree to which systems can continue their missions when failures or other unforeseen circumstances occur. In this section, we follow the classification of means of dependability identified by Crestani et al.100
A first means of dependability is to forecast faults or conflicts. For instance, robots in a manufacturing plant must stop if an operator comes too near. IoT technology can provide useful tools to realize this: Rampa et al.101 mounted a network of small transceivers in a robotic cell and estimated the operator's position from the perturbations of the radio field, while other researchers embedded sensors in clothing and helmets. Qian et al.102 developed a probabilistic framework to avoid conflicts between robot and human motion by combining observations from fixed cameras and on-board sensors with historical knowledge of human trajectories (see Figure 8).
In a marine context, acoustic sensor networks have been used to provide information on water currents and ship positions to a path planner for underwater gliders, to avoid collisions when they surface103 or to preserve energy.104
A second means of dependability is robust system engineering, which can take new forms in an IoRT system. For instance, mobile wireless communication is a key enabler for Industry 4.0, where field devices, fixed machines and mobile AGVs are all connected. IoT protocols such as WirelessHART or Zigbee Pro were designed to address industry concerns on reliability, security and cost.105
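The fault-forecasting idea of Rampa et al.101 can be caricatured in a few lines: if the received signal strength on any link between the fixed transceivers deviates strongly from its empty-cell baseline, an operator is assumed to be present and the robot is stopped. The link names, baseline values and threshold below are all invented for illustration.

```python
# Baseline RSSI (dBm) of links between fixed transceivers in an empty cell;
# a person entering the cell perturbs these values (device-free detection).
BASELINE = {"link_ab": -42.0, "link_bc": -55.0, "link_ca": -48.0}
THRESHOLD_DB = 6.0  # deviation treated as "operator present" (assumed value)

def operator_detected(current: dict) -> bool:
    """Flag a presence when any link deviates strongly from its baseline."""
    return any(abs(current[l] - BASELINE[l]) > THRESHOLD_DB for l in BASELINE)

def speed_command(current: dict, nominal: float) -> float:
    """Stop the robot when the radio field suggests an operator is near."""
    return 0.0 if operator_detected(current) else nominal

print(speed_command({"link_ab": -43.0, "link_bc": -54.0, "link_ca": -49.0}, 0.8))  # 0.8
print(speed_command({"link_ab": -51.0, "link_bc": -55.5, "link_ca": -47.0}, 0.8))  # 0.0
```

A production system would of course calibrate the baseline online and localize the perturbation rather than apply a single global threshold; the sketch only shows the sensing-to-actuation link.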
Figure 8. Sensory data from laser and global cameras are fed to the perception module, together with human motion patterns learned by the modelling module. Three types of abstracted observations are then input to the controller: PAO, PRO and RSO. Using a Partially Observable Markov Decision Process, a suitable navigation policy is generated (Image from Qian et al.102). PAO: people's action observation; PRO: people-robot relation observation; RSO: robot state observation (c) 2013 SAGE.
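In the same spirit as the controller of Figure 8, the following is a minimal sketch of belief tracking and policy selection. The states, observation model and numbers are invented, and a real POMDP solver would reason over actions and rewards rather than a one-step heuristic.

```python
# Belief over where the human is heading; updated from a camera observation.
PRIOR = {"kitchen": 0.5, "hallway": 0.5}
# P(observation | heading): fixed cameras see the person near the kitchen door.
LIKELIHOOD = {"near_kitchen_door": {"kitchen": 0.8, "hallway": 0.2}}

def update_belief(belief: dict, obs: str) -> dict:
    """Bayes rule: posterior is proportional to likelihood times prior."""
    post = {h: LIKELIHOOD[obs][h] * p for h, p in belief.items()}
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

def pick_route(belief: dict) -> str:
    """Route the robot through the area the human is least likely to occupy."""
    return min(belief, key=belief.get)

belief = update_belief(PRIOR, "near_kitchen_door")
print(pick_route(belief))  # hallway
```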
7. Kim JH. Ubiquitous robot. In: Bernd R (ed) Computational Intelligence, Theory and Applications. Berlin, Heidelberg: Springer, 2005, pp. 451–459.
8. Ha YG, Sohn JC, Cho YJ, et al. Towards ubiquitous robotic companion: design and implementation of ubiquitous robotic service framework. ETRI J 2005; 27(6): 666–676.
9. Kehoe B, Patil S, Abbeel P, et al. A survey of research on cloud robotics and automation. IEEE Trans Autom Sci Eng 2015; 12(2): 398–409.
10. Hu G, Tay WP and Wen Y. Cloud robotics: architecture, challenges and applications. IEEE Network 2012; 26(3): 21–28.
11. Qureshi B and Koubâa A. Five traits of performance enhancement using cloud robotics: a survey. Procedia Comput Sci 2014; 37: 220–227.
12. Kamei K, Nishio S, Hagita N, et al. Cloud networked robotics. IEEE Network 2012; 26(3): 28–34.
13. Mohanarajah G, Hunziker D, D'Andrea R, et al. Rapyuta: a cloud robotics platform. IEEE Trans Autom Sci Eng 2015; 12(2): 481–493.
14. Salmerón-García J, Íñigo-Blasco P, Díaz-del-Río F, et al. A tradeoff analysis of a cloud-based robot navigation assistant using stereo image processing. IEEE Trans Autom Sci Eng 2015; 12(2): 444–454.
15. Chowdhury AR. IoT and robotics: a synergy. PeerJ Preprints 2017; 5: e2760v1.
16. Simoens P, Mahieu C, Ongenae F, et al. Internet of robotic things: context-aware and personalized interventions of assistive social robots. In: 5th IEEE international conference on cloud networking (IEEE CloudNet 2016) (ed Giordano S), Pisa, Italy, 3–5 October 2016, pp. 1–4. IEEE.
17. Bhavanam P and Jami MT. Performance assessment in internet of robotic things based on IoT. IJITR 2017; 5(3): 6459–6462.
18. Bharathi K and Anbarasan K. Design of multi robot system using fuzzy based IoT. Int J Res Sci Eng 2017. Available at: www.ijrse.org (accessed 21 December 2017).
19. Razafimandimby C, Loscri V and Vegni AM. A neural network and IoT based scheme for performance assessment in internet of robotic things. In: 2016 IEEE first international conference on Internet-of-Things Design and Implementation (IoTDI), Berlin, Germany, 4–8 April 2016, pp. 241–246. IEEE.
20. Oh J, Park Y, Choi J, et al. A rule-based context transforming model for robot services in internet of things environment. In: 2017 14th international conference on ubiquitous robots and ambient intelligence (URAI), pp. 331–336. IEEE.
21. Scilimati V, Petitti A, Boccadoro P, et al. Industrial internet of things at work: a case study analysis in robotic-aided environmental monitoring. IET Wireless Sen Syst 2017; 7(5): 155–162.
22. Atzori L, Iera A and Morabito G. The internet of things: a survey. Comput Networks 2010; 54(15): 2787–2805.
23. Shi W, Cao J, Zhang Q, et al. Edge computing: vision and challenges. IEEE Internet of Things Journal 2016; 3(5): 637–646. DOI: 10.1109/JIOT.2016.2579198.
24. Cieslewski T, Lynen S, Dymczyk M, et al. Map API-scalable decentralized map building for robots. In: 2015 IEEE international conference on robotics and automation (ICRA), Seattle, WA, USA, 26–30 May 2015, pp. 6241–6247. IEEE.
25. Szlenk M, Zieliński C, Figat M, et al. Reconfigurable agent architecture for robots utilising cloud computing. In: Szewczyk R, Zielinsky C and Kaliczynska M (eds) Progress in automation, robotics and measuring techniques. Cham: Springer, 2015, pp. 253–264.
26. Multi-annual roadmap for Horizon 2020. SPARC Robotics, euRobotics AISBL, Brussels, Belgium. https://www.eu-robotics.net/sparc (accessed 21 December 2017).
27. Christensen H. A roadmap for US robotics, from internet to robotics. National Robotics Initiative 2.0, 2016. Online at http://cra.org/ccc (accessed 21 December 2017).
28. Remy SL and Blake MB. Distributed service-oriented robotics. IEEE Int Comput 2011; 15(2): 70–74.
29. Dietrich A, Zug S, Mohammad S, et al. Distributed management and representation of data and context in robotic applications. In: 2014 IEEE/RSJ international conference on intelligent robots and systems (IROS 2014), Chicago, IL, USA, 14–18 September 2014, pp. 1133–1140. IEEE.
30. Chamberlain W, Leitner J, Drummond T, et al. A distributed robotic vision service. In: 2016 IEEE international conference on robotics and automation (ICRA), Stockholm, Sweden, 16–21 May 2016, pp. 2494–2499. IEEE.
31. Sprute D, Pörtner A, Rasch R, et al. Ambient assisted robot object search. In: Mokhtari M, Abdulrazak B and Aloulou H (eds) International conference on smart homes and health telematics. Cham: Springer, pp. 112–123.
32. Thrun S and Leonard JJ. Simultaneous localization and mapping. In: Siciliano B and Khatib O (eds) Springer handbook of robotics. Cham: Springer, 2008, pp. 871–889.
33. Khaliq AA, Pecora F and Saffiotti A. Inexpensive, reliable and localization-free navigation using an RFID floor. In: 2015 European conference on mobile robots (ECMR), Lincoln, UK, 2–4 September 2015, pp. 1–7. IEEE.
34. He S and Chan SHG. Wi-Fi fingerprint-based indoor positioning: recent advances and comparisons. IEEE Commun Surv Tutor 2016; 18(1): 466–490.
35. Hassan NU, Naeem A, Pasha MA, et al. Indoor positioning using visible LED lights: a survey. ACM Comput Surv (CSUR) 2015; 48(2): 20.
36. Karbownik P, Krukar G, Shaporova A, et al. Evaluation of indoor real time localization systems on the UWB based system case. In: 2015 international conference on indoor positioning and indoor navigation (IPIN 2015), Banff, Canada, 2015.
37. Luoh L. Zigbee-based intelligent indoor positioning system soft computing. Soft Comput 2014; 18(3): 443–456.
38. Bonaccorsi M, Fiorini L, Cavallo F, et al. A cloud robotics solution to improve social assistive robots for active and healthy aging. Int J Soc Robot 2016; 8(3): 393–408.
39. Jadidi MG, Patel M and Miro JV. Gaussian processes online observation classification for RSSI-based low-cost indoor positioning systems. In: 2017 IEEE international conference on robotics and automation (ICRA), Singapore, 29 May–3 June 2017, pp. 6269–6275. IEEE.
40. Cavallo F, Limosani R, Manzi A, et al. Development of a socially believable multi-robot solution from town to home. Cogn Comput 2014; 6(4): 954–967.
41. Mutlu B and Forlizzi J. Robots in organizations: the role of workflow, social, and environmental factors in human-robot interaction. In: ACM/IEEE international conference on human-robot interaction (HRI), Amsterdam, The Netherlands, 12–15 March 2008, pp. 287–294. ACM. DOI: 10.1145/1349822.1349860.
42. FIROS. http://docs.firos.apiary.io/#reference/0/connect-robot/ (accessed 9 September 2017).
43. Quigley M, Conley K, Gerkey B, et al. ROS: an open-source robot operating system. In: ICRA workshop on open source software, vol. 3, Kobe, Japan, 12–17 May 2009, p. 5. IEEE.
44. Das SM, Hu YC, Lee CG, et al. Mobility-aware ad hoc routing protocols for networking mobile robot teams. J Commun Networks 2007; 9(3): 296–311.
45. Sliwa B, Ide C and Wietfeld C. An OMNeT++ based framework for mobility-aware routing in mobile robotic networks. CoRR 2016; abs/1609.05351. http://arxiv.org/abs/1609.05351 (accessed 21 December 2017).
46. Deyle T, Tralie CJ, Reynolds MS, et al. In-hand radio frequency identification (RFID) for robotic manipulation. In: 2013 IEEE international conference on robotics and automation (ICRA), Karlsruhe, Germany, 6–10 May 2013, pp. 1234–1241. IEEE.
47. Rusu RB, Gerkey B and Beetz M. Robots in the kitchen: exploiting ubiquitous sensing and actuation. Robot Auton Syst 2008; 56(10): 844–856.
48. Zhong RY, Dai Q, Qu T, et al. RFID-enabled real-time manufacturing execution system for mass-customization production. Robot Comput Int Manuf 2013; 29(2): 283–292.
49. Wan J, Tang S, Hua Q, et al. Context-aware cloud robotics for material handling in cognitive industrial internet of things. IEEE Int Things J 2017. http://ieeexplore.ieee.org/document/7983343/.
50. Deyle T, Nguyen H, Reynolds M, et al. RFID-guided robots for pervasive automation. IEEE Pervasive Comput 2010; 9(2): 37–45.
51. Fortino G, Guerrieri A, Russo W, et al. Middlewares for smart objects and smart environments: overview and comparison. In: Fortino G and Trunfio P (eds) Internet of Things Based on Smart Objects. Cham: Springer, 2014, pp. 1–27.
52. Han SN, Khan I, Lee GM, et al. Service composition for IP smart object using realtime web protocols: concept and research challenges. Comput Stand Interf 2016; 43: 79–90.
53. Guerrieri A, Loscri V, Rovella A, et al. Management of cyber physical objects in the future internet of things: methods, architectures and applications. Cham: Springer, 2016.
54. Ghallab M, Nau D and Traverso P. Automated Planning: Theory and Practice. Elsevier, 2004.
55. Cashmore M, Fox M, Long D, et al. ROSPlan: planning in the robot operating system. In: ICAPS, Austin, TX, USA, 25–30 January 2015, pp. 333–341. AAAI.
56. Cirillo M, Karlsson L and Saffiotti A. Human-aware task planning: an application to mobile robots. ACM Trans Intell Syst Technol (TIST) 2010; 1(2): 15:1–15:26. https://dl.acm.org/citation.cfm?id=1869404.
57. Kovacs DL. A multi-agent extension of PDDL3.1. In: Proceedings of the 3rd workshop on the international planning competition (IPC), ICAPS-2012 (eds Seilva JR and Bonet B), Atibaia, Brazil, 25–29 June 2012, pp. 19–27. University of Sao Paulo.
58. Alterovitz R, Koenig S and Likhachev M. Robot planning in the real world: research challenges and opportunities. AI Magazine 2016; 37(2): 76–84.
59. Bidot J and Biundo S. Artificial intelligence planning for ambient environments. In: Ultes S, Nothdurft F, Heinroth T, et al. (eds) Next Generation Intelligent Environments. Cham: Springer, 2011, pp. 195–225.
60. Kim J, Lee J, Kim J, et al. M2M service platforms: survey, issues, and enabling technologies. IEEE Commun Surv Tutor 2014; 16(1): 61–76.
61. Yazdani F, Brieber B and Beetz M. Cognition-enabled robot control for mixed human-robot rescue teams. In: Menegatti E, Michael N, Berns K, et al. (eds) Intelligent Autonomous Systems 13. Cham: Springer, 2016, pp. 1357–1369.
62. Misra DK, Sung J, Lee K, et al. Tell me Dave: context-sensitive grounding of natural language to manipulation instructions. Int J Robot Res 2016; 35(1–3): 281–300.
63. Wachs JP, Kölsch M, Stern H, et al. Vision-based hand-gesture applications. Commun ACM 2011; 54(2): 60–71.
64. Hawkins KP, Vo N, Bansal S, et al. Probabilistic human action prediction and wait-sensitive planning for responsive human-robot collaboration. In: 2013 13th IEEE-RAS international conference on humanoid robots (Humanoids), Atlanta, GA, USA, 15–17 October 2013, pp. 499–506. IEEE.
65. Wolf MT, Assad C, Vernacchia MT, et al. Gesture-based robot control with variable autonomy from the JPL BioSleeve. In: 2013 IEEE international conference on robotics and automation (ICRA), Karlsruhe, Germany, 6–10 May 2013, pp. 1160–1165. IEEE.
66. Cambria E. Affective computing and sentiment analysis. IEEE Intell Syst 2016; 31(2): 102–107.
67. De Ruyter B, Saini P, Markopoulos P, et al. Assessing the effects of building social intelligence in a robotic interface for the home. Int Comput 2005; 17(5): 522–541.
68. McColl D, Hong A, Hatakeyama N, et al. A survey of autonomous human affect detection methods for social robots engaged in natural HRI. J Intell Robot Syst 2016; 82(1): 101–133.
69. Leite I, Henriques R, Martinho C, et al. Sensors in the wild: exploring electrodermal activity in child-robot interaction. In: 2013 8th ACM/IEEE international conference on human-robot interaction (HRI), Tokyo, Japan, 3–6 March 2013, pp. 41–48. IEEE. DOI: 10.1109/HRI.2013.6483500.
70. Bekele E and Sarkar N. Psychophysiological feedback for adaptive human–robot interaction (HRI). In: Advances in physiological computing. Springer, 2014, pp. 141–167.
71. Tapus A and Thi-Hai-Ha D. Stress game: the role of motivational robotic assistance in reducing users' task stress. Int J Soc Robot 2015; 7(2): 227–240.
72. Chen M, Ma Y, Hao Y, et al. CP-Robot: cloud-assisted pillow robot for emotion sensing and interaction. In: International conference on industrial IoT technologies and applications (eds Wan J, Humar I and Zhang D). Cham: Springer, pp. 81–93.
73. Al-Taee MA, Al-Nuaimy W, Muhsin ZJ, et al. Robot assistant in management of diabetes in children based on the internet of things. IEEE Int Things J 2017; 4(2): 437–445.
74. Lieto A. Representational limits in cognitive architectures. In: Proceedings of EUCognition 2016, Cogn Robot Arch, Vienna, Austria, 8–9 December 2016, pp. 16–20. European Society for Cognitive Systems.
75. Oltramari A and Lebiere C. Pursuing artificial general intelligence by leveraging the knowledge capabilities of ACT-R. In: Bach J, Goertzel B and Iklé M (eds) AGI. Berlin, Heidelberg: Springer, pp. 199–208.
76. Compton M, Barnaghi P, Bermudez L, et al. The SSN ontology of the W3C semantic sensor network incubator group. Web Semant Sci Serv Agents World Wide Web 2012; 17: 25–32.
77. Seydoux N, Drira K, Hernandez N, et al. IoT-O, a core-domain IoT ontology to represent connected devices networks. Cham: Springer International Publishing, 2016, pp. 561–576. ISBN 978-3-319-49004-5. DOI: 10.1007/978-3-319-49004-5_36.
78. Prestes E, Carbonera JL, Fiorini SR, et al. Towards a core ontology for robotics and automation. Robot Auton Syst 2013; 61(11): 1193–1204.
79. Jorge VA, Rey VF, Maffei R, et al. Exploring the IEEE ontology for robotics and automation for heterogeneous agent interaction. Robot Comput Int Manuf 2015; 33: 12–20.
80. Foteinos V, Kelaidonis D, Poulios G, et al. Cognitive management for the internet of things: a framework for enabling autonomous applications. IEEE Vehic Technol Magaz 2013; 8(4): 90–99.
81. Mezghani E, Exposito E and Drira K. A model-driven methodology for the design of autonomic and cognitive IoT-based systems: application to healthcare. IEEE Trans Emerg Topics Comput Intell 2017; 1(3): 224–234.
82. Kousi N, Koukas S, Michalos G, et al. Service oriented architecture for dynamic scheduling of mobile robots for material supply. Procedia CIRP 2016; 55: 18–22.
83. Michalos G, Makris S, Aivaliotis P, et al. Autonomous production systems using open architectures and mobile robotic structures. Procedia CIRP 2015; 28: 119–124.
84. Reis J. Towards an industrial agent oriented approach for conflict resolution. In: 9th Doctoral Symposium in Informatics Engineering (DSIE) (eds Oliveira E and Souse A), Porto, Portugal, 30–31 January 2014, pp. 9–20. University of Porto.
85. Järvenpää E, Siltala N and Lanz M. Formal resource and capability descriptions supporting rapid reconfiguration of assembly systems. In: 2016 IEEE international symposium on assembly and manufacturing (ISAM), Fort Worth, TX, USA, 21–22 August 2016, pp. 120–125. IEEE.
86. Nishio S, Kamei K and Hagita N. Ubiquitous network robot platform for realizing integrated robotic applications. Intell Auton Syst 2013; 12: 477–484.
87. Broxvall M. The PEIS kernel: a middleware for ubiquitous robotics. In: Proceedings of the IROS-07 workshop on ubiquitous robotic space design and applications, San Diego, CA, USA, 29 October–2 November 2007, pp. 212–218. IEEE.
88. Emmi L, Gonzalez-de-Soto M, Pajares G, et al. New trends in robotics for agriculture: integration and assessment of a real fleet of robots. Sci World J 2014; 2014: 21.
89. Bac CW, Henten EJ, Hemming J, et al. Harvesting robots for high-value crops: state-of-the-art review and challenges ahead. J Field Robot 2014; 31(6): 888–911.
90. Ojha T, Misra S and Raghuwanshi NS. Wireless sensor networks for agriculture: the state-of-the-art in practice and future challenges. Comput Electron Agric 2015; 118: 66–84.
91. Srbinovska M, Gavrovski C, Dimcev V, et al. Environmental parameters monitoring in precision agriculture using wireless sensor networks. J Clean Product 2015; 88: 297–307.
92. Han XZ, Kim HJ, Kim JY, et al. Path-tracking simulation and field tests for an auto-guidance tillage tractor for a paddy field. Comput Electron Agric 2015; 112: 161–171.
93. Matveev AS, Hoy M, Katupitiya J, et al. Nonlinear sliding mode control of an unmanned agricultural tractor in the presence of sliding and control saturation. Robot Auton Syst 2013; 61(9): 973–987.
94. Gealy DV, McKinley S, Guo M, et al. DATE: a handheld co-robotic device for automated tuning of emitters to enable precision irrigation. In: 2016 IEEE international conference on automation science and engineering (CASE), Fort Worth, TX, USA, 21–25 August 2016, pp. 922–927. IEEE.
95. Verbelen T, Simoens P, De Turck F, et al. AIOLOS: middleware for improving mobile application performance through cyber foraging. J Syst Software 2012; 85(11): 2629–2639.
96. De Coninck E, Bohez S, Leroux S, et al. Middleware platform for distributed applications incorporating robots, sensors and the cloud. In: 2016 5th IEEE international conference on cloud networking (Cloudnet), pp. 218–223. IEEE.
97. Bacciu D, Chessa S, Gallicchio C, et al. A general purpose distributed learning model for robotic ecologies. IFAC Proc Vol 2012; 45(22): 435–440.
98. Verstraeten D, Schrauwen B, D'Haene M, et al. An experimental unification of reservoir computing methods. Neural Netw 2007; 20(3): 391–403. DOI: 10.1016/j.neunet.2007.04.003. http://www.sciencedirect.com/science/article/pii/S089360800700038X (accessed 21 December 2017).
99. Bacciu D, Gallicchio C, Micheli A, et al. Learning context-aware mobile robot navigation in home environments. In: 5th international conference on information, intelligence, systems and applications (IISA 2014), Chania, Greece, 7–9 July 2014, pp. 57–62. IEEE.
100. Crestani D, Godary-Dejean K and Lapierre L. Enhancing fault tolerance of autonomous mobile robots. Robot Auton Syst 2015; 68: 140–155.
101. Rampa V, Vicentini F, Savazzi S, et al. Safe human-robot cooperation through sensor-less radio localization. In: 2014 12th IEEE international conference on industrial informatics (INDIN), Porto Alegre, Brazil, 27–30 July 2014, pp. 683–689. IEEE.
102. Qian K, Ma X, Dai X, et al. Decision-theoretical navigation of service robots using POMDPs with human-robot co-occurrence prediction. Int J Adv Robot Syst 2013; 10(2): 143.
103. Pereira AA, Binney J, Jones BH, et al. Toward risk aware mission planning for autonomous underwater vehicles. In: 2011 IEEE/RSJ international conference on intelligent robots and systems (IROS), San Francisco, CA, USA, 25–30 September 2011, pp. 3147–3153. IEEE.
104. Zhang Y, Chen H, Xu W, et al. Spatiotemporal tracking of ocean current field with distributed acoustic sensor network. IEEE J Ocean Eng 2017; 42(3): 681–696.
105. Wang Q and Jiang J. Comparative examination on architecture and protocol of industrial wireless sensor network standards. IEEE Commun Surv Tutor 2016; 18(3): 2197–2219.
106. Jarchlo EA, Haxhibeqiri J, Moerman I, et al. To mesh or not to mesh: flexible wireless indoor communication among mobile robots in industrial environments. In: Mitton N, Loscri V and Mouradian A (eds) International conference on ad-hoc networks and wireless. Cham: Springer, pp. 325–338.
107. Lindhorst T and Nett E. Dependable communication for mobile robots in industrial wireless mesh networks. In: Koubâa A and Martinez-de Dios JR (eds) Cooperative Robots and Sensor Networks 2015. Cham: Springer, 2015, pp. 207–227.
108. Bohez S, Verbelen T, De Coninck E, et al. Sensor fusion for robot control through deep reinforcement learning. arXiv preprint arXiv:1703.04550, 2017.