
Mobile Manipulation Hackathon:

Moving into real world applications


Maximo A. Roa, Mehmet Dogar, Jordi Pages, Carlos Vivas, Antonio Morales, Nikolaus Correll,
Michael Gorner, Jan Rosell, Sergi Foix, Raphael Memmesheimer, Francesco Ferro

DOI: 10.1109/MRA.2021.3061951

Published version is available at https://ieeexplore.ieee.org/document/9380912.

© 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained
for all other uses, in any current or future media, including reprinting/republishing this material
for advertising or promotional purposes, creating new collective works, for resale or
redistribution to servers or lists, or reuse of any copyrighted component of this work in other
works. Uploaded in accordance with the publisher's self-archiving policy.
Mobile Manipulation Hackathon: Moving into real world applications
Máximo A. Roa1 , Mehmet Dogar2 , Jordi Pages3 , Carlos Vivas3 , Antonio Morales4 , Nikolaus Correll5 ,
Michael Görner6 , Jan Rosell7 , Sergi Foix8 , Raphael Memmesheimer9 , Francesco Ferro3

Abstract— The Mobile Manipulation Hackathon was held in late 2018 at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) to showcase latest applications of wheeled robotic manipulators. The challenge had an open format, where the teams developed their chosen application for a specific robotic platform, using simulation tools and afterwards integrating it into the robotic system. This paper presents the competition and analyzes the results, with information gathered during the competition days and from a survey circulated among the finalist teams. We provide an overview of the mobile manipulation field, identify key areas required for further development to facilitate the implementation of mobile manipulators on real applications, and discuss ideas on how to structure future hackathon-style competitions to enhance their impact on the scientific and industrial community.

I. INTRODUCTION

Autonomous mobile manipulation combines two fundamental robotic skills: mobility in the environment and manipulation of objects. The ability to do both simultaneously opens numerous applications in diverse areas including manufacturing, logistics, home automation and healthcare. Such applications typically require complex (structured and unstructured) manipulation. They also require navigation in large spaces, possibly in cooperation or close interaction with human beings or other robotic systems.

Mobile manipulation is a complex field. Mobility introduces additional pose uncertainty to the manipulation problem, while limiting the available perception systems and introducing additional constraints to the navigation problem that now needs to also consider one or more arms mounted on the robot. Mobile manipulation is also a systems challenge, requiring the designer to draw on multiple different fields: perception, navigation, task, path and grasp planning, control, error recovery, human-robot interaction, and robotic hardware development. Each field is an area of research in its own right, but the particular challenge in mobile manipulation is to obtain an integrated system that can combine a large variety of hardware and software components to increase the range of tasks that the robot can perform, while decreasing the dependency on prior information and increasing the awareness the robot has of its current situation.

As the complexity of mobile manipulation lies at the interface of the different fields mentioned above, and any significant experimentation will not only require mastery of a variety of techniques but system integration and acquisition of hardware, it is difficult to establish mobile manipulation as a field of its own. Similarly, it is not clear what the commercial applications of mobile manipulation really are. While performing truly human-like tasks is only possible when combining mobility and manipulation, the high cost and limited performance emphasize commercial solutions that are either only mobile, such as floor cleaning or transport, only manipulation, such as conventional robotic assembly lines, or constrain the system in such a way that manipulation remains trivial, for example picking up and transporting entire shelves in warehouses. However, other applications such as telepresence and remote assistance systems are moving toward demanding some way to remotely interact with objects and persons, for instance in elderly assistance scenarios. Also, industrial scenarios might be able to solve multiple tasks with fixed-base manipulators, but a single, flexible mobile platform could autonomously take over multiple tasks in different locations, thus possibly improving the return on investment of the robot, especially important for the case of small and medium enterprises (SMEs) that cannot afford multiple static robotic platforms.

To address these challenges and build a community around mobile manipulation, the IEEE Robotics and Automation Society (RAS) Technical Committee (TC) on Mobile Manipulation together with their members and collaborators organized a “Hackathon” — a word combining “hacking” and “marathon” — that gives common ground to participants by providing a complete mobile manipulation system offering a basic level of operation. This allows the community to showcase (1) their work in relevant sub-fields such as grasping, manipulation, perception or motion-planning, and (2) application domains that might truly benefit from a mobile manipulation solution.

1 M. Roa is with the Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Wessling, Germany.
2 M. Dogar is with the University of Leeds, UK.
3 J. Pages, C. Vivas and F. Ferro are with PAL Robotics, Barcelona, Spain.
4 A. Morales is with the Robotic Intelligence Laboratory at Universitat Jaume I, Castellon, Spain.
5 N. Correll is with the Department of Computer Science, University of Colorado Boulder, USA.
6 M. Görner is with University of Hamburg, Germany.
7 J. Rosell is with Institut d'Organització i Control - Universitat Politècnica de Catalunya, Spain.
8 S. Foix is with Institut de Robòtica i Informàtica Industrial, CSIC-UPC, Spain.
9 R. Memmesheimer is with Universität Koblenz-Landau, Germany.

The hackathon phenomenon has been described in the context of digital innovation as an appropriate vehicle to
bring people from different disciplines together as well as to on soft manipulators. Similarly, the tasks do not require
actually engage the community with a particular topic [1]. mobility.
Consequently, a body of work exists on how to design There have also been recent competitions that target mo-
a hackathon to optimize the desired outcome in terms of bile manipulation. The FetchIt! Mobile Manipulation Chal-
networking [2], learning [3], or broadening participation in lenge was held at the IEEE Int. Conf. on Robotics and
computing [4]. In its purest form, the hackathon format Automation (ICRA) 2019 [10]. The task was to assemble
therefore brings groups of unrelated people together to share a kit formed by six objects obtained from stations around
knowledge and work towards a solution, learn from each a designated arena, combining navigation and manipulation
other, and potentially form long-term connections. skills. Similarly, the RoboCup@Home competition4 , using
Given the current state of the art in hardware and software, the Toyota HSR [11] robot as the official platform, includes
we deemed it unlikely to get significant insights from an a set of tidying up or service tasks in living room or
ad-hoc event in which teams are formed at the conference kitchen set ups, which require mobile manipulation. The
venue, with no previous contact or chance to learn about the RoboCup@Home also encourages teams to make “Open
available tools. Instead, the Hackathon has been organized as Challenge” demonstrations (i.e. free demonstrations deter-
a multi-staged competition from which finalist teams were mined by the teams, instead of the fixed set of tasks), though
selected based on an initial entry relying mostly on simulation these open demonstrations are not the main focus, they are
results. performed at off-hours of the competition, and therefore the
“Open Challenge” award is not necessarily awarded [12].
Related hackathons and competitions
The SciRoc Challenge [13], which is organized as part of the
Robotic competitions have very similar aims as a European Robotics League and builds on the success of the
hackathon, but operate with a different time scale (months European Robotics Challenge (EuRoC) [14], also includes
of preparation vs. a single day, for example) and emphasize a fixed set of mobile manipulation tasks, such as delivering
robust solutions above prototypes. Competitions have a long coffee shop orders, and shopping pick and pack.
history in robotics and artificial intelligence with their entries The unique feature of our Hackathon, compared to the
often determining the state-of-the-art for years to come, such competitions above, is that it brings mobile manipulation
as in localization [5] or autonomous driving [6]. They can together with open demonstrations at the center stage. As
also lead to unexpected insights on what the problems in explained above, recently there have been multiple mo-
a systems challenge really are. For example, the Amazon bile manipulation competitions that focus on a fixed set
Picking Challenge [7] has shown that warehouse picking is of tasks. This has the advantage of creating benchmark
less of a grasping and manipulation challenge (the majority tasks that enable measuring progress objectively. Therefore,
of teams used suction) rather than a perception problem. such benchmark competitions are crucial for the community.
Similarly, the Industrial Assembly Challenge [8] has shown However, we believe an open format also has its place among
that perception and planning are secondary when dealing competition formats: It allows (a) the teams to demonstrate
with sufficiently restricted and well-defined problems. their core research innovations more directly, and (b) the
Despite much progress in these research domains, open- community/audience to get informed about the state-of-the-
loop control as well as mechanical templates and fixtures art for a rich variety of tasks. With the Mobile Manipulation
usually excel in such scenarios. These insights can then be Hackathon, our goal has been to push the teams to perform
used to refine the competition format to push the community their own research demonstrations, to identify the tasks that
in a desired direction. the research community is working on.
Many successful competitions focused on robotic manip- In this article, we explain the structure of the Mobile
ulation have been organized in recent years. The Robotic Manipulation Hackathon that was hosted together with the
Grasping and Manipulation Competitions have been orga- IEEE/RSJ Int. Conf. on Intelligent Robots and Systems
nized at IEEE IROS 2016 [9], 2017, 2019 and 20201 (online). (IROS) in 2018, discuss the applications developed by dif-
They included a fixed set of tasks, for example Service ferent teams and their performances, while presenting an
tasks (such as spooning peas, or preparing iced tea), Manu- overview of the current state of mobile manipulation. Based
facturing tasks (assembly/disassembly), and Logistics tasks on these observations, we discuss system advances that are
(bin picking). The tasks did not require mobility. The Real needed to enable even more fertile multi-day hackathons, as
Robot Challenge2 was organized by the Max Planck Institute needed to enable even more fertile multi-day hackathons, as
for Intelligent Systems (MPI-IS) in 2020. This competition to improve our understanding of the specific challenges and
is based on remote execution of submitted software on a applications of mobile manipulation.
robotic hand hosted at MPI-IS. There is a fixed set of tasks
such as grasping and pushing, which do not require mobility. II. THE FIELD OF MOBILE MANIPULATION
The IEEE Int. Conf. on Soft Robotics (RoboSoft) also holds
Bringing together mobility and manipulation, mobile ma-
a competition3 with a manipulation challenge that focuses
nipulation systems need to overcome some of the most
1 https://rpal.cse.usf.edu/competition_iros2020/
2 https://real-robot-challenge.com/
3 http://www.robosoft2019.org/robosoft_competition.html 4 https://athome.robocup.org/
• Generality: Mobile manipulation systems must perform three decades and a half. The first prototype of a mobile
a variety of tasks, acquire new skills, and apply these manipulator was MORO back in 1984 [25]. The first relevant
skills in novel situations. They must be able to contin- attempts to mount robotic arms on mobile platforms hap-
uously adapt and improve their performance. pened during the 90s, with robots such as HERMIES [26],
• High dimensional state space: Versatile robotic systems and KAMRO [27]. The particular problem of coordination
must be equipped with many actuators and sensors, of base and arm motions also had seminal contributions on
resulting in high-dimensional state spaces for planning these years [28], [29]. Since then and over the last three
and control. decades there have been many developments and highlights
• Uncertainty: The ability to locomote, the required gener- in wheeled manipulation systems. Hvilshøj et al. surveyed
ality in task execution, and the usage of multiple sensors up to 30 different prototypes developed until 2011 [30]. The
and actuators, make it impractical to engineer the entire main application domains of mobile manipulation systems
environment for the task. As a result, mobile manipu- ranged from domestic service [31], [32] through space [33]
lation systems have to explicitly address problems that to industry, with commercial solutions from e.g. KUKA5 or
arise due to the uncertainty of sensing and actuation. NEOBOTIX6 .
• System complexity: Mobile manipulation systems re- Around 2010 a wave of more advanced, bi-manual multi-
quire the integration of a large number of hardware purpose wheeled manipulators started (Fig. 1) with systems
components for sensing, manipulation and locomotion, such as the PR2 [32] developed at Willow Garage, the
as well as the orchestration of algorithmic capabilities Care-O-bot 3 [34] developed at Fraunhofer AIS, HERB [35]
in perception, manipulation, control, planning, etc. developed at CMU, Rollin’ Justin [36] developed at DLR and
The mobility of these systems can take multiple forms the ARMAR series developed at KIT [37]. This wave rep-
depending on the environment: air/space (drones, planes, resented a milestone since it coincided with the introduction
helicopters, satellites), water (ships, submarines) or land of ROS (Robot Operating System) [38] to the community,
(wheeled, legged robots). In the air/space, mobile manip- which, through its modular structure and components such as
ulation systems often take the shape of an aerial vehicle the ROS Navigation Stack7 and MoveIt!8 , made it easier to
carrying some sort of manipulator [15], [16], e.g. a gripper build the complex software systems controlling these robots.
[17] or a multi-link arm [18], [19] attached to a rotorcraft, 2010 was also the year when the IEEE-RAS Technical
or a manipulator endowed with some flying mechanism, Committee on Mobile Manipulation was established.
e.g. rotors [20]. A significant challenge for these systems Though this series of wheeled manipulation systems have
is to maintain flight stability during object manipulation, created a lot of excitement and interest in mobile manipula-
which limits the range of manipulation operations that can tion and its applications over the years, it also revealed the
be performed. This coupling between the control of mobility challenges. The cost of building such systems was especially
and manipulation also exists in the water, where the robot prohibitive for large scale use and adoption, hampering
needs to maintain a stable pose while experiencing additional the development of a larger research community. Early
forces due to object manipulation [21], [22]. Land is the most adopters of mobile manipulators were the military and law
common environment for mobile manipulation. Humans live enforcement areas, who used robots for dangerous missions
on land and, therefore, a larger variety of mobile manipu- including bomb defusal or remote inspection of installations.
lation tasks can be found here. Furthermore, the control of In the last few years a rise of simpler yet fully integrated
mobility and manipulation can be decoupled more easily on and commercially-oriented wheeled manipulation systems
land, when compared to in-air or underwater manipulation: has been observed. These developments include TIAGo9
A land robot can attain a statically stable configuration and, (unimanual) and TIAGo++ (bimanual) by PAL Robotics,
for small enough forces, not worry about balancing during Fetch Mobile Manipulator10 by Fetch Robotics (available
manipulation. for researchers), Swift11 from IAM robotics, RB-1, RB-
Two common forms of mobility on land are legs and Kairos, RB-Eken and RB-Vulcano systems from Robotnik12 ,
wheels. Legged locomotion and bimanual manipulation are industrially-oriented KUKA KMR13 , and assistance-oriented
typically combined in humanoid robots, e.g. [23]. Even
though planning and control for legged locomotion can 5 https://www.kuka.com/en-gb/products/mobility/
be more complex than for wheeled locomotion, legs can mobile-robots
be advantageous depending on the ground characteristics. 6 https://www.neobotix-roboter.de/produkte/
Particularly for search and rescue operations, where debris, mobile-manipulatoren
7 http://wiki.ros.org/navigation
obstacles, and steps on the ground are expected, legged 8 https://moveit.ros.org/
mobile manipulation is preferred. Such systems dominated, 9 http://pal-robotics.com/robots/tiago/
for example, the DARPA Robotics Challenge [24]. 10 https://fetchrobotics.com/robotics-platforms/
The most common and versatile mobile manipulation sys- fetch-mobile-manipulator/
11 https://www.iamrobotics.com/our-solution/
tems, however, are wheeled systems. Wheeled systems strike
12 https://robotnik.eu/products/
the right balance between ease of mobility and manipulation,
mobile-manipulators/
and access to most human environments. The development 13 https://www.kuka.com/en-gb/products/mobility/
of wheeled mobile manipulators has spanned the last mobile-robots/kmr-iiwa
Fig. 1: Timeline for development of wheeled robotic manipulators in the last decade.
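As noted above, much of this generation of robots is programmed through ROS's modular components, in particular the Navigation Stack for the base and MoveIt! for the arm. The following minimal sketch shows how a mobile manipulation task is typically driven through those two standard interfaces; the frame, planning-group and target names are illustrative assumptions for a hypothetical robot, not the configuration of any specific platform discussed here.

```python
#!/usr/bin/env python
# Minimal sketch: drive the base via the ROS Navigation Stack (move_base),
# then plan and execute an arm motion via MoveIt!. All names are placeholders.
import sys
import rospy
import actionlib
import moveit_commander
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("mobile_manipulation_sketch")

# 1) Navigate the base to a pose expressed in the map frame.
nav = actionlib.SimpleActionClient("move_base", MoveBaseAction)
nav.wait_for_server()
goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"
goal.target_pose.pose.position.x = 2.0
goal.target_pose.pose.position.y = 1.0
goal.target_pose.pose.orientation.w = 1.0
nav.send_goal(goal)
nav.wait_for_result()

# 2) Once the base is in place, move the arm with MoveIt!.
arm = moveit_commander.MoveGroupCommander("arm")   # planning group name is an assumption
arm.set_named_target("pregrasp")                   # hypothetical stored configuration
arm.go(wait=True)
```

In practice these two steps are interleaved with perception and error handling, which is precisely the integration burden discussed throughout this article.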

Toyota HSR14 . The field is still in evolution, and interesting cases of human-robot collaboration. There have been recent
concepts have been recently presented, such as Handle15 important efforts in this direction, even though there is still
from Boston Dynamics, and Stretch16 from Hello Robot. uncertainty about regulations covering the use of mobile
Fig. 1 presents a timeline of development of these wheeled manipulators. Depending on the area of application different
robotic manipulators. These systems target applications such regulations apply. For example, in industrial settings many
as part supply and transport in manufacturing and logistics, integrators apply both ISO 10218-1 about Safety of indus-
and object transport and human-interaction in healthcare and trial robots and ISO/TS 15066 about Collaborative Robots
personal care. Yet the mobile manipulation market is still a when the manipulator of the mobile robot is in action, and
niche, and estimations of the market for this type of systems they apply either ISO 3691-4 or former EN 1175-1:1998
are difficult to obtain. For instance, the latest report from when the robot navigates by keeping the arm static to
the International Federation of Robotics does not include prevent conflicts between the aforementioned norms. More
mobile manipulation systems as a separate domain but rather recently, the American National Standard (ANS) published
combined in the overall statistics according to application the ANSI/RIA R15.08-1-2020, targeting specifically safety
areas (industrial, logistics, medical, field robotics, defense, requirements for industrial mobile robots. On the other hand,
etc.) [39]. However, it is recognized that the combination of healthcare applications may require ISO 13482 about safety
mobile platforms with collaborative robots opens the door requirements for personal care robots.
to solve new use cases and could substantially increase the III. MOBILE MANIPULATION HACKATHON
demand of robotic systems.
The Mobile Manipulation Hackathon was conceived to en-
With the advances in development of mobile manipulators
courage participants to implement demonstrations that show-
and their wide range of potential applications, comes the need for
case the applicability of a wheeled robotic manipulator. The
standardization, especially in topics related to safety in
call was open to contributions from any field (e.g. learning
14 https://www.toyota-global.com/innovation/
by demonstration, grasp planning, human-robot interaction)
partner_robot/
or domain (e.g. logistics, healthcare, service), as long as they
15 https://www.bostondynamics.com/handle could be integrated into a predefined robotic platform to
16 https://hello-robot.com/product execute a mobile manipulation application. The selection of
researchers and engineers. PAL Robotics sponsored the com-
petition by lending three TIAGo robots, available on-site dur-
ing the final event. In addition, selected teams were allowed
to spend a week testing and tuning their demonstration at
the PAL Robotics site the month before the final event.

Competition procedure
The participation in the Hackathon was an activity that
had to be prepared well in advance. With this purpose we
designed a procedure that gave the teams enough time to
develop their proof of concept, and the organizers enough
time to set up the selection procedures. The procedure
consisted of the following milestones.
• Call for participation (Early 2018). An announcement
was distributed in several mailing lists with descriptions
of the Hackathon scope, goals, procedures and timeline.
• Expression of interest (March 2018). Interested parties
submitted a letter introducing the team and present-
Fig. 2: Wheeled manipulation platform TIAGo. ing their proposed application and demo, background,
planned use of equipment, etc.
• Feedback to teams (April 2018). Organizers provided
suggestions on how to create a high impact demo.
the application and the final script of the demonstration were
• Entry Submission (June 2018). Teams submitted a video
proposed by the participant teams. The Hackathon organizers
and a short technical report explaining in detail their
evaluated and filtered the most promising and appropriate
proposed demo and their original approach/technology
proposals to ensure that they fitted the scope and purpose of
to be showcased at the Hackathon. At this stage, simu-
the activity.
lations were allowed in the video.
This methodological approach is different to most other
• Announcement of finalists (July 2018). Six finalists
competitions that are based on detailed task descriptions for
were selected from all the submissions. The selection
the participants to solve. In our experience, these approaches
criteria included maturity of the development, novelty
have the main drawback that they deliver overfitted and
and relevance of the specific components, and relevance
engineered solutions to the specified tasks that are not easily
of the application.
generalizable and therefore usually have low impact on
• Support in Barcelona (September, 2018). Finalist teams
the associated research fields. In an open domain such as
were given the opportunity to test and tune their demos
mobile manipulation we feel that this is not effective. As an
on the robot for one week at PAL Robotics headquarters.
alternative we propose an open format in which teams can
• Competition (October 1-5, 2018). The final event took
demonstrate their knowledge on tasks proposed by them.
place during the IEEE/RSJ Int. Conf. on Intelligent
Mobile Manipulation Platform Robots and Systems (IROS) in Madrid, Spain. The
event lasted three days, and two teams participated
In order to ease and motivate participation in the each day. Teams were given the whole day with the
Hackathon we proposed a common mobile manipulation robots on-site to prepare their demonstration, which was
platform, TIAGo by PAL Robotics17 . It is endowed with a presented in the late afternoon. A committee of three
7 DoF arm, a liftable torso, and a pan-tilt head equipped international experts comprised of Prof. Jeannette Bohg
with a RGB-D camera and stereo microphones (Fig. 2). from Stanford University, Dr. Graham Deacon from
Participants in the Hackathon benefited from the com- Ocado Technology and Prof. Weiwei Wan from Osaka
pletely ROS-based interfaces, and a simulation environment University, evaluated the demonstrations. The criteria
to develop, in their own labs, an initial proposal for their for the evaluation were novelty, academic merit, indus-
demonstration. The demo was required to necessarily and trial merit, quality of the integration and impressiveness
effectively use the potential of a mobile robot (e.g. the of the demonstration. The winners were announced at
proposed demonstration could not be solved with a fixed- the end of the third day.
base manipulator only). Participants could exploit the ROS
tutorials and demonstrations publicly available18 . Competition results
Applications developed in simulation were later imple-
mented on the real robot with the support of PAL Robotics Thirteen teams submitted entries. These teams came from
countries worldwide (India (2), Germany (2), Spain (3),
17 http://pal-robotics.com/robots/tiago Switzerland (1), Singapore (1), Japan (1), Brazil (1), Mex-
18 http://wiki.ros.org/Robots/TIAGo ico (1), USA (1)) and proposed an extensive variety of ap-
Team name Affiliation Country # Members Demo
Homer Team Koblenz University Germany 2 Imitation learning of human actions
www.youtube.com/watch?v=Pf91wv2ddQE
Robotics.SG Nanyang Technological University, Pana- Singapore 6 Item placing in an e-commerce warehouse
sonic R&D Centre Singapore, Hand Plus www.youtube.com/watch?v=_3wZ3J6NWCc
Robotics, and Panasonic Connected Solu-
tions Company
IRI Technical University of Catalunya/Spanish Spain 3 Adaptive robotic feeding assistance
National Research Council www.youtube.com/watch?v=dM9DoZ2z6To
PMM Tohoku Tohoku University Japan 5 Dexterous liquid pouring in a domestic situation
TAMS Hamburg University Germany 5 TIAGo as a bartender
www.youtube.com/watch?v=AOkhmyDtDfQ
IOC-AUDECO Technical University of Catalunya/Institute Spain 10 TIAGo serving drinks
of Industrial and Control Engineering www.youtube.com/watch?v=VocnVbh5Nq8

TABLE I: Finalist teams.
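Since all finalists programmed against TIAGo's ROS interfaces, much of the remote development described above amounts to subscribing to the head RGB-D camera and commanding the standard joint-trajectory controllers. The sketch below illustrates that pattern; the topic and controller names are assumptions based on typical TIAGo setups and should be checked against the tutorials cited above.

```python
#!/usr/bin/env python
# Sketch of exercising a TIAGo-like ROS interface: read the head camera and
# command the pan-tilt head. Topic and controller names are illustrative only.
import rospy
from sensor_msgs.msg import Image
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

def on_rgb(msg):
    # A perception module would convert this to OpenCV via cv_bridge here.
    rospy.loginfo_throttle(5.0, "received %dx%d RGB frame" % (msg.width, msg.height))

rospy.init_node("tiago_interface_sketch")
rospy.Subscriber("/xtion/rgb/image_raw", Image, on_rgb)   # head RGB-D camera (assumed topic)

head = rospy.Publisher("/head_controller/command", JointTrajectory, queue_size=1)
rospy.sleep(1.0)  # let the publisher connect
traj = JointTrajectory(joint_names=["head_1_joint", "head_2_joint"])
traj.points = [JointTrajectoryPoint(positions=[0.3, -0.4],
                                    time_from_start=rospy.Duration(2.0))]
head.publish(traj)   # pan and tilt the head to look toward a table
rospy.spin()
```

The same interfaces are exposed by the Gazebo simulation of the robot, which is what made remote preparation in the teams' own labs practical.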

plications, as listed below (some applications were proposed items, identified the objects inside the tray (verifying
by several teams). as well that the tray was not empty), and planned the
• Imitation learning of manipulation tasks required motions to put the items back in the corre-
• Robotic home assistant sponding shelf. During setup, the robot scanned and pre-
• Robotic assistant in a hospital stored a map of the area, including the location of items
• Robotic feeding assistant on the different shelves. The item identification was
• Autonomous mechanic assistant performed using a pre-trained learning-based perception
• Autonomous librarian approach, which also delivers the pose of the object. The
• Autonomous bartender acquisition of images for training the perception system
• Gardening applications was performed using an in-house developed rotatory
• Item picking in logistics scenarios platform to scan the shape and texture of the object.
The six finalist teams are described in Table I, and pictures Once an individual object pose was defined, a grasp
of their demos are shown in Fig. 3. A video overview of the motion was planned to pick up the item. Checkpoints
competition is publicly available19 . Due to the high quality were defined to verify if a grasp was successful or not.
of the demos, the jury decided to select two winners, teams The robot then navigated to the required shelf to place
TAMS and Robotics.SG. Their demos were: each item at its intended location.
• Team TAMS: implemented a software system that con- IV. SURVEY OF THE COMPETITION
verted TIAGo into a bartender, pouring drinks and cock-
To compare the effort for the competition and its relation
tails to clients from behind a counter (Fig. 4). The robot
to the research performed by the team, we distributed the
recognized a person sitting in a predefined location on
following survey via email to the finalist teams. The survey
the other side of the counter, and approached them
contained 19 questions, and the answers were provided in
to take their order. The robot instructed the person to
free text format.
point to their favorite drink on a typical cocktail menu,
detected the menu’s pose on the table via keypoint de- 1) Team survey
tection, and extracted the person’s fingertip via contour a) Team name
detection and heuristic filtering. The robot could detect b) Institution(s)
if the person was trying to fool it by pointing elsewhere c) Number of team members (include breakdown by
but one of the drink names. Once the desired drink was academic degree)
identified via deictic interaction, the robot proceeded to d) Previous experience in competitions
a separate table where the liquor bottles were stored, 2) Development process
and created a composite manipulation plan to retrieve a) Did you develop the system from scratch? (if not,
the required ingredients, transport them and pour them provide a previous publication if possible)
one after the other in a transparent glass in front of b) Estimated time of demo development, in person
the customer. The glass was identified using the IR months
image of the RGB-D camera. A composite motion plan
3) Demo/system description
was generated to pour a specific amount (parameterized
by duration) into the glass, without spilling during the a) Description of the demo
reaching motions. b) Sensors used for the demo (tactile, vision, micro-
• Team Robotics.SG: the robot was used to re-shelve
phones, etc.)
products that were returned to a convenience store c) Hardware adaptations/additional tools for the
(Fig. 5). The robot picked up a tray with the returned demo
d) Software framework
19 www.youtube.com/watch?v=mt7JGXHb8jQ e) Simulation tools
(a) TAMS (b) IOC-AUDECO (c) IRI

(d) Homer Team (e) Robotics.SG (f) PMM Tohoku


Fig. 3: Demos of the finalist teams in the live competition.
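Several demos (TAMS, IOC-AUDECO, PMM Tohoku) involved pouring, which the TAMS description above characterizes as dispensing an amount parameterized by duration. A generic, open-loop way to realize such a timed pour on a ROS-controlled wrist is sketched below; the controller and joint names, angles and timings are illustrative assumptions, not any team's actual values.

```python
import rospy
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

def timed_pour(pub, wrist_joint, tilt_angle, pour_seconds):
    """Open-loop pour: tilt the wrist over the glass, hold for a duration
    proportional to the desired amount, then return upright."""
    def send(angle, seconds):
        traj = JointTrajectory(joint_names=[wrist_joint])
        traj.points = [JointTrajectoryPoint(positions=[angle],
                                            time_from_start=rospy.Duration(seconds))]
        pub.publish(traj)
        rospy.sleep(seconds)

    send(tilt_angle, 1.5)       # tilt the bottle over the glass
    rospy.sleep(pour_seconds)   # poured amount is assumed proportional to this hold time
    send(0.0, 1.5)              # return upright before the retreat motion

# Example usage with hypothetical names:
# rospy.init_node("pour_sketch")
# pub = rospy.Publisher("/wrist_controller/command", JointTrajectory, queue_size=1)
# rospy.sleep(1.0)
# timed_pour(pub, "wrist_roll_joint", tilt_angle=1.2, pour_seconds=3.0)
```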

Fig. 4: Snapshots of demo execution for the TAMS team. From left to right: user pointing to the menu for choosing a drink,
TIAGo moving to the bar for retrieving one of the required liquors, and TIAGo pouring the (real) liquor on the glass.
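The deictic menu interaction shown here relies on locating the customer's fingertip through contour detection and heuristic filtering. A generic OpenCV sketch of that idea (our reconstruction, not the team's code) is given below; the skin-color thresholds are placeholder values.

```python
import cv2
import numpy as np

def fingertip_from_mask(hand_mask, toward=(0.0, -1.0)):
    """Return the contour point of the largest blob that extends furthest in the
    given image direction (e.g. toward the menu), as a fingertip estimate."""
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)            # heuristic: largest blob is the hand
    pts = hand.reshape(-1, 2).astype(np.float32)
    scores = pts @ np.asarray(toward, dtype=np.float32)
    return tuple(pts[int(np.argmax(scores))])

# Example: build the hand mask from a placeholder skin-color range in HSV.
# hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
# mask = cv2.inRange(hsv, (0, 40, 60), (25, 255, 255))
# tip_xy = fingertip_from_mask(mask)
```

The fingertip pixel can then be projected onto the detected menu plane to decide which drink was selected.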

Fig. 5: Snapshots of demo execution for the Robotics.SG team. From left to right: TIAGo retrieving the bin with the returned
items, TIAGo navigating the store using a pre-recorded map, and TIAGo placing one of the items on the required shelf.
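For item identification, Robotics.SG trained a YOLO detector on images from their own acquisition rig. The minimal sketch below shows the generic way such a pre-trained darknet model can be run with OpenCV's DNN module; the file names and thresholds are placeholders, not the team's model.

```python
import cv2
import numpy as np

# Load a darknet/YOLO model trained offline (file names are placeholders).
net = cv2.dnn.readNetFromDarknet("items.cfg", "items.weights")
out_layers = net.getUnconnectedOutLayersNames()

def detect_items(bgr, conf_thresh=0.5):
    """Return (class_id, confidence, (x, y, w, h)) for each detected item."""
    h, w = bgr.shape[:2]
    blob = cv2.dnn.blobFromImage(bgr, 1.0 / 255, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    detections = []
    for output in net.forward(out_layers):
        for row in output:
            scores = row[5:]
            cls = int(np.argmax(scores))
            conf = float(scores[cls])
            if conf < conf_thresh:
                continue
            cx, cy, bw, bh = row[0] * w, row[1] * h, row[2] * w, row[3] * h
            detections.append((cls, conf,
                               (int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh))))
    return detections
```

A registered depth image can then provide the 3D position of each detected item for grasp planning.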
f) Motion planner and navigation. The Homer team demonstrated autonomous
g) External libraries/dependencies picking and sorting of cutlery after a party (the objects
h) How much autonomy did the robot have? (full were randomly placed on the table) using semantic scene
autonomy, shared autonomy) reasoning, as the objects were not easily identifiable using
i) Type of control only depth information. A guarded motion was used to grasp
j) Was there interaction with humans? (tactile, the cutlery by first touching the table in a pre-grasp pose
voice, etc.) and then closing the fingers to grasp the object. Suitable
4) Takeaways checkpoints were provided to verify whether the grasp had
a) Which components of the system caused you the been successful. The object was then placed in a bowl located
most trouble during the competition? on a different table. The process was repeated until the
b) Did you evolve the demo after the Hackathon? table was clean. The Robotics.SG team showed a re-shelving
(include reference to publications as an outcome application, as described above. The PMM Tohoku team
of the demo, if that is the case) demonstrated a liquid pouring task, with detection of the
c) What is the most important lesson taken from transparent bottle and container. This detection was based
your participation in the Hackathon? on simple segmentation techniques, fitting a plane to the
table, removing it and then fitting cylinders to the remaining
Team survey clusters of points (which represented the bottle and cup).
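The detection pipeline just described, fitting and removing the table plane and then fitting primitives to the remaining clusters, can be prototyped in a few lines with a point cloud library. The sketch below uses Open3D purely as an illustration (our choice of library, not necessarily the team's), with placeholder thresholds.

```python
import numpy as np
import open3d as o3d

def tabletop_clusters(cloud, plane_dist=0.01, eps=0.02, min_points=50):
    """Remove the dominant plane (the table) via RANSAC, then cluster the
    remaining points into object candidates such as the bottle and the cup."""
    _, inliers = cloud.segment_plane(distance_threshold=plane_dist,
                                     ransac_n=3, num_iterations=1000)
    objects = cloud.select_by_index(inliers, invert=True)   # keep points off the table
    labels = np.asarray(objects.cluster_dbscan(eps=eps, min_points=min_points))
    if labels.size == 0:
        return []
    return [objects.select_by_index(np.where(labels == k)[0])
            for k in range(labels.max() + 1)]

# cloud = o3d.io.read_point_cloud("tabletop_scene.pcd")
# candidates = tabletop_clusters(cloud)   # a cylinder can then be fitted to each cluster
```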
Among the finalists, five were university teams, while one The other three teams required some interaction with hu-
was a mixture of institutions (university, research institution mans. Team IRI showed a robot capable of feeding impaired
and companies). As a condition to enter the Hackathon, humans in a safe and delicate manner. The demo used
we limited the number of team members to 5; however, an Amazon Alexa 3G interface to request commands, e.g.
the survey reported that the real number of individual con- choice of food, and human detection to find and interact with
tributors was between 2 and 10. All the teams had some the person. The robot transported the food and placed it in the
combination of PhD and M.Sc. students, and some teams table in front of the person. An arm-mounted camera allowed
included supervisors (postdocs/professors), technicians or the robot to detect if the human was interested in eating
undergrads. From all the participants, 15% were postdocs (when the human looked toward the camera), and when this
or professors, 45% were PhD students, 35% were M.Sc and happened, it retrieved food with a spoon. Then, if the robot
undergrad students, and 5% were technicians. detected that the person opened the mouth, the person was
Four out of the six finalists had some previous experi- fed. The process continued until the person indicated to the
ence in other competitions, including the European Robotics robot that no more food was required. After this, the robot
League, RoboCup, European Robotics Challenge, Amazon removed the food from the table (and politely said goodbye
Robotics Challenge, World Robot Summit, DJI Mobile Ma- to the person). Team TAMS showed a bartending application,
nipulation Challenge, and Nvidia Jetson Challenge. However, as described above. Team IOC-AUDECO also showed a
previous experience was not a guarantee of success, as one drink serving application. In this case, the robot would first
of the two winner teams reported no previous experience in perceive the drinks available on a cluttered table, and the
robotics competitions. human could choose the desired drink among the available
ones using a tablet or keyboard. Then, the robot would plan
Development process a manipulation sequence to retrieve the desired drink from
All the finalists based their demonstration on previous the table; the plan included moving away cans that were
work, either scientific (papers or PhD theses) or techno- obstructing the path to grasp the desired drink. A randomized
logical (platform/software components developed for other physics-based motion planner introduced in [40] was used for
competitions). Four of the teams had at least one mobile this purpose. This planner permits robot-object and object-
manipulation platform in their labs. The estimated time for object interactions such that when there is no collision-free
preparing the specific Hackathon demo strongly depended path towards the object to be grasped, no explicit high-level
on the previous experience of the team, ranging from 1 to reasoning of the task is required, but possible complex multi-
9 full person months. Note however that this estimation of body dynamical interactions are evaluated using a physics
efforts is just indicative, as it was recalled after the actual engine, and considered in the expansion of a sampling-based
competition. planner. In particular, the planner enhances the state validity
checker, the control sampler and the tree exploration strategy
Demo/System description of the KPIECE kinodynamic motion planner [41].
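The physics-based planner mentioned above builds on the KPIECE kinodynamic planner available in OMPL, replacing its state validity checker, control sampler and tree exploration strategy. The fragment below shows only the generic OMPL scaffolding that gets extended (a plain KPIECE setup with placeholder dynamics and validity checks), not the Kautham implementation itself; the physics-based variant plugs a physics engine into the propagation step.

```python
from ompl import base as ob
from ompl import control as oc

def is_state_valid(state):
    # Placeholder check; the physics-based planner replaces this with a validity
    # notion that tolerates robot-object and object-object contacts.
    return True

def propagate(start, control, duration, result):
    # Placeholder kinematics; the physics-based planner delegates this step to a
    # physics engine that simulates multi-body interactions.
    result.setX(start.getX() + control[0] * duration)
    result.setY(start.getY() + control[1] * duration)
    result.setYaw(start.getYaw())

space = ob.SE2StateSpace()
bounds = ob.RealVectorBounds(2)
bounds.setLow(-2.0)
bounds.setHigh(2.0)
space.setBounds(bounds)

cspace = oc.RealVectorControlSpace(space, 2)
cbounds = ob.RealVectorBounds(2)
cbounds.setLow(-0.3)
cbounds.setHigh(0.3)
cspace.setBounds(cbounds)

ss = oc.SimpleSetup(cspace)
ss.setStateValidityChecker(ob.StateValidityCheckerFn(is_state_valid))
ss.setStatePropagator(oc.StatePropagatorFn(propagate))
ss.setPlanner(oc.KPIECE1(ss.getSpaceInformation()))   # the planner extended in [40]

start, goal = ob.State(space), ob.State(space)
start().setX(0.0); start().setY(0.0); start().setYaw(0.0)
goal().setX(1.5); goal().setY(1.0); goal().setYaw(0.0)
ss.setStartAndGoalStates(start, goal, 0.05)
print(ss.solve(10.0))
```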
The demos shown on the final round were a mixture of The teams based their demos mainly on the hardware and
interactive and non-interactive executions. All of the demos sensors available on TIAGo. Team IRI additionally required a
were fully autonomous, and required human intervention 6-DoF force torque sensor to guarantee a safe feeding to the
only for solving certain failures (e.g. objects out of reach, human. They also developed their own special 3D printed
failures in self-localization, or unintended collisions). The gripper adapters for assuring an easy and stable grasp on
three non-interactive demos focused on completing tasks that the cutlery. Apart from these upgrades, the capabilities of
required some sequence of object perception, manipulation, TIAGo for carrying out a collision-free navigation and arm
motion planning were used. Team IOC-AUDECO used a 4-
fingered Allegro hand instead of the default 2-finger gripper,
to show more advanced grasping capabilities. Team TAMS
required an additional HD webcam on top of TIAGo, to get
an image with enough resolution to detect the desired drink
from the menu. Team Robotics.SG added a portable table to
the robot to be able to carry the tray with the returned items.
Additionally, they modified the shelves so that their lower
part was perceived as a solid obstacle by the laser scanner
used for navigation (otherwise, the shelf would have been
missed, as the four legs are thin).
In terms of software, the developments were mainly based
on ROS, as all robot interfaces were tightly integrated
Fig. 6: Challenging areas in the Hackathon demos.
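Fig. 6 ranks localization among the main problem areas; as reported below, team IOC-AUDECO mitigated table localization errors by falling back on ArUco fiducials. A generic OpenCV sketch of marker-based pose estimation, assuming the opencv-contrib aruco module and placeholder intrinsics and marker size, is:

```python
import cv2
import numpy as np

# Placeholder camera intrinsics and marker size; real values come from calibration.
K = np.array([[525.0, 0.0, 320.0], [0.0, 525.0, 240.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)
marker_side = 0.08  # marker side length in meters

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def marker_poses(bgr):
    """Map marker id -> (rvec, tvec) of the marker in the camera frame."""
    corners, ids, _ = cv2.aruco.detectMarkers(bgr, aruco_dict)
    if ids is None:
        return {}
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(corners, marker_side, K, dist)
    return {int(i): (r.ravel(), t.ravel()) for i, r, t in zip(ids.ravel(), rvecs, tvecs)}

# A marker attached to a known spot on the table gives the table pose directly,
# bypassing drift in the robot's self-localization.
```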
with this framework. Simulations and visualizations mostly
employed Gazebo. All the teams created specialized modules
for certain tasks required for their demo, and some teams Interestingly enough, on a survey performed on the par-
relied also on additional libraries. Team IRI used OpenFace20 ticipants of the Amazon Picking Challenge [7], perception
for face recognition, and OpenPose21 for person detection. was also identified as the most difficult component in the
Team IOC-AUDECO implemented planning in clutter using competition. Different techniques were employed by the
the Kautham Project22 . Team TAMS used the MoveIt Task teams for object detection and pose estimation: based on
Constructor23 , developed by some team members and fully features, CAD models or surface textures, learning-based
integrated in ROS, to define and plan actions consisting of detection and estimation, and registration based on fusion of
multiple interdependent subtasks. Team Robotics.SG used the detection of transparent objects (bottles, glasses).
YOLO24 for object perception, which was trained using the detection of transparent objects (bottles, glasses).
images obtained with a self-built acquisition system [42]. The Localization of the mobile base was ranked as the second
Homer team reused custom mapping and navigation tools25 most challenging area. To cope with localization problems,
previously developed for other robotic competitions. They for instance, team IOC-AUDECO relied on Aruco markers
used Mask-RCNN26 for object detection and segmentation, to enhance the robustness of the table localization. Team
which combined with planar surface segmentation, helped to Robotics.SG wrapped paper around the shelves legs to fa-
detect the cutlery. cilitate mapping, navigation and localization of the mobile
For control, most teams relied on open-loop position-based base.
execution of planned sequences, followed by a verification We were also interested in finding out if the experience
stage using TIAGo’s sensors (joint encoders, vision) to gained from the Hackathon was exploited afterwards in some
decide if the plan was executed as intended. Team IRI used way, or if it was an isolated effort. From the four teams
a force-based control loop to control the robotic arm while that provided an answer to this question, three indicated that
the feeding action was in progress. Team Koblenz integrated they evolved some of the components used in the demo
continuous current measurements into the grasping approach either to create a more advanced lab demo (teams IOC-
to detect contact with the table. Interestingly, no team used AUDECO and TAMS), or to reuse some solutions for a
visual servoing techniques for controlling the manipulation new competition (Homer team). The demo from team TAMS,
actions. This indicates the focus on restricted scenarios with for instance, was transferred to a different platform, a PR2
quasi-static assumptions or that explicitly required human robot, thus showing the generality of their solution27 . Three
cooperation. of the teams (IRI, IOC-AUDECO and TAMS) indicated that
Takeaways some of the demo components were further developed and
were already published or are submitted for publication as
We asked the teams to identify the most troublesome com- scientific papers. The IRI team has been able to transfer
ponents for their demonstration. Each team could identify the knowledge gained with the force loop controller used
any number of challenging areas; Fig. 6 summarizes the in the feeding task to a new scenario involving bimanual
responses. The most problematic area was object detection. cloth manipulation [43]. The IOC-AUDECO team continued
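Both contact-driven exceptions reported above (the force loop used by IRI during feeding and the current-based table contact used by the Homer team before closing the gripper) reduce to a guarded motion: execute a slow trajectory while monitoring an effort signal and abort when a threshold is crossed. A generic rospy sketch of that pattern, with assumed controller and topic names, is:

```python
import rospy
import actionlib
from sensor_msgs.msg import JointState
from control_msgs.msg import FollowJointTrajectoryAction, FollowJointTrajectoryGoal

class GuardedMotion:
    """Run an arm trajectory but cancel it as soon as the effort (motor current)
    on a monitored joint exceeds a threshold, e.g. when the table is touched."""

    def __init__(self, joint, threshold,
                 action_ns="/arm_controller/follow_joint_trajectory"):  # assumed name
        self.joint, self.threshold, self.contact = joint, threshold, False
        self.client = actionlib.SimpleActionClient(action_ns, FollowJointTrajectoryAction)
        rospy.Subscriber("/joint_states", JointState, self._on_joint_states)

    def _on_joint_states(self, msg):
        if self.joint in msg.name and not self.contact:
            effort = msg.effort[msg.name.index(self.joint)]
            if abs(effort) > self.threshold:
                self.contact = True
                self.client.cancel_all_goals()   # stop the descent at contact

    def execute(self, goal):
        """goal is a FollowJointTrajectoryGoal describing the slow descent.
        Returns True if the motion was stopped by contact, False otherwise."""
        self.contact = False
        self.client.wait_for_server()
        self.client.send_goal(goal)
        self.client.wait_for_result()
        return self.contact

# Usage sketch (goal construction omitted):
#   gm = GuardedMotion("arm_4_joint", threshold=2.0)
#   touched = gm.execute(descent_goal)   # on contact, the gripper is then closed
```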
20 https://cmusatyalab.github.io/openface/
21 https://github.com/CMU-Perceptual-Computing-Lab/openpose
22 https://sir.upc.edu/projects/kautham/
detect and reconstruct transparent objects [45].
23 https://github.com/ros-planning/moveit_task_constructor
We finally asked the teams what was the most important
24 https://github.com/pjreddie/darknet/wiki/YOLO:-Real-Time-Object-Detection
lesson they learned from the Hackathon. Team IRI high-
lighted the need for further supervision during the demo
25 https://github.com/homer-robotics
26 https://github.com/matterport/Mask_RCNN 27 https://www.youtube.com/watch?v=8S2MvKNbwmM
execution. They report that as a lesson-learned, their current From the perspective of robot manufacturers, the
demos are now carefully designed to accommodate double- Hackathon was also a great opportunity to gather valuable
check control at different levels of their execution. In this feedback from both experienced and novel users of the
line, team IOC-AUDECO identified the need for more robust robots, which helps to improve how the next generation of
error detection and recovery strategies to resume tasks and robots is conceived. The research community can also benefit
recover from unexpected situations during executions. Team from this kind of competitions to identify tools, libraries and
TAMS highlighted the benefits of integrating independent frameworks that could help accelerate the implementation
components in a unified demo, and recognized the need for of real-world applications with complex robots like mobile
intensive testing of each component before the integration manipulators. As an example of this, one of the perspectives
to avoid more difficult debugging of the overall execution. for mobile manipulators is the adoption in the coming years
Team Homer appreciated the benefit of having on-site robotic of ROS2, which will provide better and more efficient data
platforms for implementing the demo out of their original distribution among processes, support to coordination of
lab, thus reducing funding needs and transportation/insurance multiple robots, security, real-time control, among others.
costs for the participating teams. Also, they highlighted the
benefits of having a common robotic platform for increasing Applications of mobile manipulation
comparability of results across multiple research groups.
Mobile manipulators are becoming increasingly available,
V. DISCUSSION AND OUTLOOK
in different scenarios, including for instance industrial au-
In this final section we discuss the lessons learned after
tomation, manufacturing, logistics, healthcare, teleassistance,
organizing the Hackathon, and the outlook for similar future
and crop harvesting. In many scenarios, robots will replace
events.
humans in dull, dirty, dangerous and difficult tasks, for
instance in bomb disposal operations or handling biological
Hackathon structure
samples, as demanded now in times of pandemic. But as
The Mobile Manipulation Hackathon challenged the com- we saw during the Hackathon, a huge potential also lies
munity to show integrated demos that exploited the benefits in collaborative applications, where robots either try to
of a mobile robotic manipulation platform. This required efficiently share their workspace or physically cooperate with
development and/or integration of components at different humans in a delicate manner. Pouring liquid into a glass,
levels, e.g. perception, navigation and localization, grasp and serving a drink or feeding a person are clear examples of it.
manipulation planning, human-robot interaction. More interesting and complex applications with autonomous
The teams were free to propose a demo script, and they bi-manual, rigid or deformable object manipulation tasks can
used this opportunity to showcase not a fixed task, but their be even considered if more than one mobile manipulator is
latest developments in the above mentioned fields. This was simultaneously used, or if a dual arm mobile manipulator is
a key difference of our Hackathon, when compared with employed.
competitions where the task is fixed. We believe both types
of events are beneficial for the field: competitions with a Further technical advances required
fixed task provide a clearer picture of the progress on humans in a delicate manner. Pouring liquid into a glass,
that particular task. Competitions with an open-task structure, As mobile manipulators are complex systems that encom-
such as ours, are useful to understand the variety of possible pass different areas, they benefit from advances in those
applications. Therefore, we encourage the community to, and fundamental topics, including perception, localization and
we intend to, organize both types of competitions in the navigation, and overall software integration and reliability,
future. which we also identified as critical topics in our competi-
tion results (Fig. 6). Some of the challenges are platform-
Use of a fixed demonstrator platform dependent, including for instance robustness in commu-
The opportunity to use a unified HW/SW platform based nication (robust and reliable wireless communications are
on ROS provided the chance to compare multiple ap- required), integration of third-party hardware and/or soft-
proaches. A solid software and simulation framework al- ware, and kinematics (e.g. simplicity to obtain a closed-form
lowed the teams to remotely develop their demo, thus reduc- inverse-kinematics solution). On the other hand, some other
ing the time required for physical integration in the robotic issues can be considered as general mobile manipulation
system. However, we recognize that the basic tools for fast difficulties, including the following:
prototyping and quick debugging still need to be enhanced • Localization: precise location procedures within the
to enable integration of full systems with few days of access robot’s environment.
to the demonstrator platform. In terms of the competition, • Perception: robust identification of the objects and esti-
it was greatly beneficial to have the robots on-site, thus mation of their poses, using different sensors, including
relieving teams from the burden of worrying about trans- hand-held cameras for visual-servoing purposes.
portation costs, insurances, basic set up and infrastructure, • Grasping: automatic determination of grasp configura-
and allowing them to focus on the pure development process. tions taking into account the scene.
• Motion planning: capacity of planning collision-free tion. Far more challenging than fixed-base manipulation,
motions as well as motions that require contact, in order mobile manipulation holds the potential for being a disrup-
to perform push actions. tive advance in robotics for applications at multiple levels,
• Task planning: automatic determination of the sequence from industrial to home and healthcare environments. Open-
of actions to perform the manipulation task; it may in- challenge hackathons/competitions targeting mobile manip-
clude regrasping actions and the need to simultaneously ulation would continue to serve the field and the community
consider the planning of the motions. in the future.
• Reasoning: need of reasoning capabilities to understand
the situation and accordingly tune all the previously VI. ACKNOWLEDGMENT
stated issues.
• Failure detection and recovery: Use of reasoning capa- The authors would like to thank the Organizing Committee
bilities for failure detection and selection of recovery of IROS 2018 in Madrid, Spain, and in particular Prof. Carlos
strategies. Balaguer, General Chair, for supporting the development of
If robots are to enter more complex scenarios such as a this Hackathon. We would also like to thank Prof. Jeannette
warehouse, grasping and manipulation capabilities must be Bohg, Dr. Graham Deacon and Prof. Weiwei Wan, for their
greatly improved, as robots must show capabilities to handle great support serving as judges for the final competition
a huge variety of products in terms of size, weight, textures, in Madrid. Additionally, we thank Ocado Technologies and
rigidity, located in different types of containers, bins or University of Leeds for sponsoring the awards, and PAL
shelves, especially in densely packed or cluttered scenarios. Robotics for providing the TIAGo platforms and facilitating
This naturally requires further integration of tactile sensing, the training at their premises. And naturally, we recognize
visual servoing, and in general fusion of multiple sensing and greatly appreciate the interest and effort of all the
modalities to enhance the awareness of the robot. participant teams in this Mobile Manipulation Hackathon.
As the competition called for system-level demos, a suc- M. Dogar received funding from the UK Engineer-
cessful execution depended on multiple components running ing and Physical Sciences Research Council under grant
simultaneously. Inevitably, failure rates multiply in such EP/P019560/1. A. Morales and the UJI Robotic Intelligence
complex scenarios, and success requires a heightened aware- Laboratory are partially funded by Ministerio de Economı́a
ness of failure sources and handling of non-prototypical y Competitividad (DPI2017-89910-R) and by Generalitat
situations. In other words, reliability of the platforms must Valenciana (PROMETEO/2020/034). The work developed by
be enhanced, and they must be endowed with advanced error team IOC-AUDECO was partially supported by the Spanish
detection and recovery capabilities. Government through the project DPI2016-80077-R. Team
Speed of execution is also a pending topic. During the IRI was supported by the European Research Council (ERC)
Hackathon demos, the robots took several minutes to perform from the European Union H2020 Programme under grant
actions that a human could do in a matter of seconds. 741930 (CLOTHILDE: CLOTH manIpulation Learning from
Autonomy while working on batteries was not an issue with DEmonstrations), and the Spanish State Research Agency
the demos, as they were relatively short (below 10 minutes in through the Marı́a de Maeztu Seal of Excellence to IRI
total for a full run), but it will be critical in real applications (MDM-2016-0656). Team TAMS was partially funded by
where the robots must be available during extended periods the German Research Foundation (DFG) and the National
of time. Science Foundation of China in project Crossmodal Learn-
A proper exploitation of the whole-body coordination to ing, TRR-169.
simultaneously employ the mobile base and the manipulator
while performing the intended task is also required [46]. This R EFERENCES
has not only implications in terms of how to effectively use
[1] G. Briscoe and C. Mulligan, “Digital innovation: The hackathon
the multiple DoFs and redundancy of these platforms, but phenomenon,” Creativeworks London, 2014.
also in terms of standardization and certification, essential to [2] E. H. Trainer, A. Kalyanasundaram, C. Chaihirunkarn, and J. D.
guarantee safety for applications of such systems especially Herbsleb, “How to hackathon: Socio-technical tradeoffs in brief,
intensive collocation,” in Proc. ACM Conf. on computer-supported
in human-robot collaborative scenarios. cooperative work & social computing, 2016, pp. 1118–1130.
The issues above (i.e. multi-modal perception, manip- [3] H. Kienzler and C. Fontanesi, “Learning through inquiry: A global
ulation planning and reasoning, system-level integration, health hackathon,” Teaching in Higher Education, vol. 22, no. 2, pp.
129–142, 2017.
speed of execution, and whole-body coordination) continue
[4] J. R. Byrne, K. O’Sullivan, and K. Sullivan, “An IoT and wearable
to be the main challenges in mobile manipulation systems, technology hackathon for promoting careers in computer science,”
as also observed during other recent mobile manipulation IEEE Trans. on Education, vol. 60, no. 1, pp. 50–58, 2016.
competitions [10], [11], [13]. A more recent development [5] I. Nourbakhsh, S. Morse, C. Becker, M. Balabanovic, R. Simmons,
S. Goodridge, H. Potlapalli, D. Hinkle, K. Jung, and D. Van Vactor,
is the introduction of competitions that focus on learning- “The winning robots from the 1993 robot competition,” AI Magazine,
based approaches, e.g. the Real Robot Challenge by MPI- vol. 14, no. 4, pp. 51–51, 1993.
IS in 2020. This follows the general trend of merging [6] S. Thrun, M. Montemerlo, H. Dahlkamp, D. Stavens, A. Aron,
J. Diebel, P. Fong, J. Gale, M. Halpenny, G. Hoffmann, et al., “Stanley:
Robotics and AI, but these competitions currently focus on The robot that won the DARPA grand challenge,” J. of Field Robotics,
manipulation-only tasks, as opposed to mobile manipula- vol. 23, no. 9, pp. 661–692, 2006.
[7] N. Correll, K. Bekris, D. Berenson, O. Brock, A. Causo, K. Hauser, K. Okada, A. Rodriguez, J. Romano, and P. R. Wurman, “Analysis and observations from the first Amazon Picking Challenge,” IEEE Trans. on Automation Science and Engineering, vol. 15, no. 1, pp. 172–188, 2016.
[8] F. Von Drigalski, C. Schlette, M. Rudorfer, N. Correll, J. C. Triyonoputro, W. Wan, T. Tsuji, and T. Watanabe, “Robots assembling machines: learning from the World Robot Summit 2018 Assembly Challenge,” Advanced Robotics, vol. 34, no. 7-8, pp. 408–421, 2020.
[9] Y. Sun, J. Falco, N. Cheng, H. R. Choi, E. D. Engeberg, N. Pollard, M. A. Roa, and Z. Xia, “Robotic grasping and manipulation competition: task pool,” in Robotic Grasping and Manipulation Challenge. Springer, 2016, pp. 1–18.
[10] Z. Han, J. Allspaw, G. LeMasurier, J. Parrillo, D. Giger, S. R. Ahmadzadeh, and H. A. Yanco, “Towards mobile multi-task manipulation in a confined and integrated environment with irregular objects,” in Proc. IEEE Int. Conf. Robotics and Automation - ICRA, 2020.
[11] J.-B. Yi and S.-J. Yi, “Mobile manipulation for the HSR intelligent home service robot,” in Proc. Int. Conf. on Ubiquitous Robots - UR, 2019, pp. 169–173.
[12] M. Matamoros, C. Rascon, S. Wachsmuth, A. W. Moriarty, J. Kummert, J. Hart, S. Pfeiffer, M. van der Brugh, and M. St-Pierre, “Robocup@home 2019: Rules and regulations,” 2019.
[13] J. Huth, “Taking robots shopping,” Nature Machine Intelligence, vol. 1, no. 11, pp. 545–545, 2019.
[14] R. Awad, F. Caccavale, and A. J. van der Meer, “Evaluation and selection activities in EuRoC: Innovations and lessons learned,” in Bringing Innovative Robotic Technologies from Research Labs to Industrial End-users. Springer, 2020, pp. 15–34.
[15] F. Ruggiero, V. Lippiello, and A. Ollero, “Aerial manipulation: A literature review,” IEEE Robotics and Automation Letters, vol. 3, no. 3, pp. 1957–1964, 2018.
[16] M. A. Roa, K. Nottensteiner, A. Wedler, and G. Grunwald, “Robotic technologies for in-space assembly operations,” in Proc. Symp. on Advanced Space Technologies in Robotics and Automation - ASTRA, 2017.
[17] A. Gomez-Tamm, V. Perez-Sanchez, B. Arrue, and A. Ollero, “SMA actuated low-weight bio-inspired claws for grasping and perching using flapping wing aerial systems,” in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems - IROS, 2020, pp. 8807–8814.
[18] S. Hamaza, I. Georgilas, G. Heredia, A. Ollero, and T. Richardson, “Design, modeling, and control of an aerial manipulator for placement and retrieval of sensors in the environment,” J. Field Robotics, vol. 37, no. 7, pp. 1224–1245, 2020.
[19] A. Ollero, G. Heredia, A. Franchi, G. Antonelli, K. Kondak, A. Sanfeliu, A. Viguria, J. R. Martinez-de Dios, F. Pierri, J. Cortés, et al., “The AEROARMS project: Aerial robots with advanced manipulation capabilities for inspection and maintenance,” IEEE Robotics & Automation Magazine, vol. 25, no. 4, pp. 12–23, 2018.
[20] M. Zhao, F. Shi, T. Anzai, K. Okada, and M. Inaba, “Online motion planning for deforming maneuvering and manipulation by multilinked aerial robot based on differential kinematics,” IEEE Robotics and Automation Letters, vol. 5, no. 2, pp. 1602–1609, 2020.
[21] D. Youakim, P. Ridao, N. Palomeras, F. Spadafora, D. Ribas, and M. Muzzupappa, “Moveit!: autonomous underwater free-floating manipulation,” IEEE Robotics & Automation Magazine, vol. 24, no. 3, pp. 41–51, 2017.
[22] S. Sivcec, J. Coleman, E. Omerdic, G. Dooley, and D. Toal, “Underwater manipulators: A review,” Ocean Engineering, vol. 163, no. 1, pp. 431–450, 2018.
[23] A. Kheddar, S. Caron, P. Gergondet, A. Comport, A. Tanguy, C. Ott, B. Henze, G. Mesesan, J. Englsberger, M. Roa, P. Wieber, F. Chaumette, F. Spindler, G. Oriolo, L. Lanari, A. Escande, K. Chapellet, F. Kanehiro, and P. Rabate, “Humanoid robots in aircraft manufacturing: The Airbus use cases,” IEEE Robotics and Automation Magazine, vol. 26, no. 4, pp. 30–45, 2019.
[24] E. Krotkov, D. Hackett, L. Jackel, M. Perschbacher, J. Pippine, J. Strauss, G. Pratt, and C. Orlowski, “The DARPA robotics challenge finals: Results and perspectives,” J. of Field Robotics, vol. 34, no. 2, pp. 229–240, 2017.
[25] J. Schuler, Integration von Förder- und Handhabungseinrichtungen. Springer-Verlag, 2013, vol. 104.
[26] C. Weisbin, B. Burks, J. Einstein, R. Feezell, W. Manges, and D. Thompson, “HERMIES-III: A step toward autonomous mobility manipulation and perception,” Robotica, vol. 8, pp. 7–12, 1990.
[27] T. Lueth, U. Nassal, and U. Rembold, “Reliability and integrated capabilities of locomotion and manipulation for autonomous robot assembly,” J. Robotics and Autonomous Systems, vol. 14, pp. 185–198, 1995.
[28] J. Cameron, D. MacKenzie, K. Ward, R. Arkin, and W. Book, “Reactive control for mobile manipulation,” in Proc. IEEE Int. Conf. Robotics and Automation - ICRA, 1993, pp. 228–235.
[29] O. Khatib, K. Yokoi, K. Chang, D. Ruspini, R. Holmberg, A. Casal, and A. Baader, “Force strategies for cooperative tasks in multiple mobile manipulation systems,” in Proc. Int. Symp. of Robotics Research, 1995.
[30] M. Hvilshøj, S. Bøgh, O. S. Nielsen, and O. Madsen, “Autonomous industrial mobile manipulation (AIMM): past, present and future,” Industrial Robot: An International Journal, vol. 39, no. 2, pp. 120–135, 2012.
[31] J. Kuehnle, A. Verl, Z. Xue, S. Ruehl, J. M. Zoellner, R. Dillmann, T. Grundmann, R. Eidenberger, and R. D. Zoellner, “6D object localization and obstacle detection for collision-free manipulation with a mobile service robot,” in Proc. IEEE Int. Conf. on Advanced Robotics, 2009, pp. 1–6.
[32] J. Bohren, R. Rusu, E. Gil, E. Marder-Eppstein, C. Pantofaru, M. Wise, L. Mösenlechner, W. Meeussen, and S. Holzer, “Towards autonomous robotic butlers: Lessons learned with the PR2,” in Proc. IEEE Int. Conf. Robotics and Automation - ICRA, 2011, pp. 5568–5575.
[33] M. A. Diftler, J. Mehling, M. E. Abdallah, N. A. Radford, L. B. Bridgwater, A. M. Sanders, R. S. Askew, D. M. Linn, J. D. Yamokoski, F. Permenter, et al., “Robonaut 2 - the first humanoid robot in space,” in Proc. IEEE Int. Conf. Robotics and Automation - ICRA, 2011, pp. 2178–2183.
[34] B. Graf, U. Reiser, M. Hägele, K. Mauz, and P. Klein, “Robotic home assistant Care-O-bot® 3 - product vision and innovation platform,” in IEEE Workshop on Advanced Robotics and its Social Impacts, 2009, pp. 139–144.
[35] S. S. Srinivasa, D. Ferguson, C. J. Helfrich, D. Berenson, A. Collet, R. Diankov, G. Gallagher, G. Hollinger, J. Kuffner, and M. V. Weghe, “HERB: a home exploring robotic butler,” Autonomous Robots, vol. 28, pp. 5–20, 2010.
[36] C. Borst, T. Wimböck, F. Schmidt, M. Fuchs, B. Brunner, F. Zacharias, P. R. Giordano, R. Konietschke, W. Sepp, S. Fuchs, C. Rink, A. Albu-Schäffer, and G. Hirzinger, “Rollin' Justin - mobile platform with variable base,” in Proc. IEEE Int. Conf. Robotics and Automation - ICRA, 2009, pp. 1597–1598.
[37] T. Asfour, K. Regenstein, P. Azad, J. Schroder, A. Bierbaum, N. Vahrenkamp, and R. Dillmann, “ARMAR-III: An integrated humanoid platform for sensory-motor control,” in Proc. IEEE-RAS Int. Conf. on Humanoid Robots, 2006, pp. 169–175.
[38] M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, and A. Ng, “ROS: an open-source Robot Operating System,” in ICRA Workshop on Open Source Software, 2009.
[39] IFR - International Federation of Robotics, World Robotics 2019 Service Robots, 2019.
[40] Muhayyuddin, M. Moll, L. Kavraki, and J. Rosell, “Randomized physics-based motion planning for grasping in cluttered and uncertain environments,” IEEE Robotics and Automation Letters, vol. 3, no. 2, pp. 712–719, 2018.
[41] I. Sucan and L. E. Kavraki, “A sampling-based tree planner for systems with complex dynamics,” IEEE Trans. Robotics, vol. 28, no. 1, pp. 116–131, 2012.
[42] Z. Chong, R. Luxman, W. Pang, Z. Yi, R. Meixuan, H. Suratno, A. Causo, and I. Chen, “An innovative robotic stowing strategy for inventory replenishment in automated storage and retrieval system,” in Proc. Int. Conf. on Control, Automation, Robotics and Vision - ICARCV, 2018.
[43] I. Garcia-Camacho, M. Lippi, M. Welle, H. Yin, R. Antanova, A. Varava, J. Borràs, C. Torras, A. Marino, G. Alenyà, and D. Kragic, “Benchmarking bimanual cloth manipulation,” IEEE Robotics and Automation Letters, vol. 5, no. 2, pp. 1111–1118, 2020.
[44] S. Saoji and J. Rosell, “Flexibly configuring task and motion planning problems for mobile manipulators,” in Proc. Annual Conf. on Emerging Technologies and Factory Automation - ETFA, 2020, pp. 1285–1288.
[45] P. Ruppel, M. Görner, N. Hendrich, and J. Zhang, “Detection and reconstruction of transparent objects with infrared projection-based RGB-D cameras,” in Proc. Int. Conf. on Cognitive Systems and Information Processing - ICCSIP, 2020.
[46] K. Harada and M. A. Roa, “Manipulation and task execution by humanoids,” in Humanoid Robotics: A Reference. Springer, 2019, pp. 1633–1655.