Article
Open-Source Drone Programming Course for
Distance Engineering Education
José M. Cañas 1, * , Diego Martín-Martín 2 , Pedro Arias 3 , Julio Vega 1 ,
David Roldán-Álvarez 1 , Lía García-Pérez 4 and Jesús Fernández-Conde 1
1 Department of Telematic Systems and Computation, Rey Juan Carlos University, 28942 Madrid, Spain;
[email protected] (J.V.); [email protected] (D.R.-Á.); [email protected] (J.F.-C.)
2 Electronic Technology Area, Rey Juan Carlos University, 28933 Madrid, Spain; [email protected]
3 JdeRobot Organization, Alcorcón, 28922 Madrid, Spain; [email protected]
4 Industrial Engineering Department, Francisco de Vitoria University, Pozuelo de Alarcón,
28223 Madrid, Spain; [email protected]
* Correspondence: [email protected]; Tel.: +34-914-888-755
Received: 12 November 2020; Accepted: 14 December 2020; Published: 17 December 2020
Abstract: This article presents a full course for autonomous aerial robotics inside the RoboticsAcademy
framework. This “drone programming” course is open-access and ready to use by any teacher or student
to teach or learn drone programming for free. In this course, students may program diverse drones on their
own computers, with no physical drone required. Unmanned aerial vehicle (UAV) applications
are essentially practical, as their intelligence resides in the software part. Therefore, the proposed
course emphasizes drone programming through practical learning. It comprises a collection of
exercises resembling drone applications in real life, such as following a road, visual landing,
and people search and rescue, including their corresponding background theory. The course has
been successfully taught for five years to students from several university engineering degrees.
Some exercises from the course have also been validated in three aerial robotics competitions,
including an international one. RoboticsAcademy is also briefly presented in the paper. It is an
open framework for distance robotics learning in engineering degrees. It has been designed as
a practical complement to the typical online videos of massive open online courses (MOOCs).
Its educational contents are built upon robot operating system (ROS) middleware (de facto standard in
robot programming), the powerful 3D Gazebo simulator, and the widely used Python programming
language. Additionally, RoboticsAcademy is a suitable tool for gamified learning and online robotics
competitions, as it includes several competitive exercises and automatic assessment tools.
Keywords: distance learning; open educational platform; drone programming; gamification; Python;
ROS middleware
1. Introduction
The impact of unmanned aerial vehicles (UAVs) on daily life has grown remarkably in the last decade.
Beyond their classic employment in aerial photography or geographic mapping, an increasing number
of drone-based solutions hit the general market every year, making human life more comfortable.
Drones are increasingly being used for video recordings, wildlife monitoring, precision agriculture,
disaster management, entertainment, industrial inspections, etc.
UAV technology’s growth has created the need to train professionals in the sector, taking the
current limits further and producing new drone applications to serve the general public. UAV design
is a cross-disciplinary field involving many technologies: electronics, mechanics, computer science,
telecommunications, etc. Drones are composed of hardware (sensors, actuators, processors) and
software. Drone programming allows more autonomous applications and widens the span of tasks
that can be solved with them. A significant part of drone intelligence and value resides in its software.
Therefore, drone programming abilities are increasingly demanded in the job market and increasingly
taught in higher education.
Drone programming is rapidly becoming a key market for graduates in computer science and
diverse university engineering degrees. Beyond classroom learning at universities, many private
companies (such as Udacity, Udemy, TheConstruct, etc.) and universities themselves provide online
courses to learn how to program a drone. Massive open online courses (MOOCs) explore new
possibilities for distance learning, taking advantage of the Internet. They typically include video
classes, slides, and educational documents.
MOOCs provide useful videos explaining theory contents and drone programming foundations
in distance learning, but they typically fall short on the practical side. This aspect is critical in this
area, where the “learn by doing” paradigm is the most effective approach and provides the best
learning results.
This article presents an open-source drone programming course based on RoboticsAcademy,
an open-access educational platform for robotics distance learning. The course’s primary goal
is to provide an open-access tool that facilitates drone programming in different scenarios,
applying concepts related to computer vision, artificial intelligence, automation, autonomous
navigation, or control algorithms. The course is ready to be used for drone programming distance
learning in higher education in engineering. It provides a set of exercises covering different real-world
drone applications (e.g., search and rescue, following a road or visual landing) and focusing on the
practical side. Additionally, RoboticsAcademy is a suitable platform for online drone programming
competitions and gamified learning, as it includes several online competitive exercises.
Typical drone systems for commercial applications are composed of a flying vehicle and a ground
station. In some classic applications, drones are teleoperated by a human operator using a remote
control over a radio link. In others, they follow preplanned missions composed of a sequence of
waypoints, sent from a ground station. Once the task is set, the onboard autopilot can command the
drone movements to automatically follow the required track using a GPS sensor, inertial measurement
unit (IMU) sensors, and a fast control loop. Two well-known autopilot firmware platforms are PX4
and ArduPilot, which communicate to the ground station using the standard MAVLink protocol.
In the last few years, drone applications have been increasing their scope, and more onboard
intelligence and sensors, such as cameras, are required. Industrial inspection or autonomous indoor
mapping are two illustrative examples. UAVs are progressively seen as aerial robots, and therefore the
most popular robot software frameworks have started to support them. For instance, the robot operating
system (ROS), the de facto standard in robotics, includes the MAVLink extendable communication node
for ROS package (MAVROS), which allows commanding missions and obtaining feedback from
MAVLink drones. This trend democratizes access to drone programming by introducing a standard
layer of abstraction and many reusable components. ROS provides a general framework to program
the autonomous drone logic and facilitates integrating already developed software pieces (drivers,
libraries, stacks, nodes, etc.). It undoubtedly reduces the entry barrier, also helping to migrate software
from one drone to another.
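For illustration purposes, the following minimal Python sketch shows how a two-waypoint mission could be pushed to a MAVLink autopilot through the standard MAVROS mission service (the coordinates are arbitrary, and service and field names may vary slightly across MAVROS versions):

import rospy
from mavros_msgs.msg import Waypoint
from mavros_msgs.srv import WaypointPush

def make_waypoint(lat, lon, alt):
    wp = Waypoint()
    wp.frame = 3           # MAV_FRAME_GLOBAL_RELATIVE_ALT
    wp.command = 16        # MAV_CMD_NAV_WAYPOINT
    wp.is_current = False
    wp.autocontinue = True
    wp.x_lat, wp.y_long, wp.z_alt = lat, lon, alt
    return wp

rospy.init_node('mission_uploader')
rospy.wait_for_service('/mavros/mission/push')
push = rospy.ServiceProxy('/mavros/mission/push', WaypointPush)
mission = [make_waypoint(40.335, -3.877, 20.0),   # arbitrary coordinates
           make_waypoint(40.336, -3.876, 20.0)]
response = push(start_index=0, waypoints=mission)
rospy.loginfo('Mission accepted: %s', response.success)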
Since ROS was not conceived as an educative platform, it lacks specific educative contents to be
followed in the classroom, as well as assessment tools for students' code. Another problem of using ROS as
a learning tool is the wide variety of possibilities provided and its ecosystem's overwhelming size
(packages, libraries, tools, drivers, etc.), which can cause a big “learning shock” and generate frustration. It is
probably the best choice for drone programming, but it is not enough on its own as an educative
platform. RoboticsAcademy aims to provide such educative content using ROS as drone middleware.
The remainder of this paper is organized as follows: the next section provides a summarized
review of related work. In Section 3, we discuss the main features and internal architecture of
the RoboticsAcademy platform in detail. The fourth section presents the distance-learning drone
programming course based on RoboticsAcademy. This platform’s usage as a tool for online drone
programming competitions and gamified learning is described in Section 5. Finally, Section 6 draws
the main conclusions and points out future lines of research.
2. Related Work
There are two trends in teaching robotics contents at university: subjects focused on industrial
robots [1–7] and subjects based on mobile robots [8–13]. In the first case, manipulation control
techniques, inverse kinematics, or trajectory calculations are usually addressed. In the second case,
the techniques generally explained are local and global navigation, position control (avoiding obstacles),
perception, self-location, etc. This dichotomy of contents is also reflected in the robotic platforms that
are used in the exercises.
Both types of robotic subjects have a notably practical bias. Student interaction with robots
facilitates the learning and assimilation of the theoretical concepts, algorithms, and techniques [14].
Exercises promote the paradigm of learning through doing, that is, active learning. They usually occur
within a specific robotic framework, using a teaching tool or a teaching suite where robot behavior is
programmed using a particular language. A great diversity of hardware platforms is used to support
the experimental component of robotics courses.
Beyond being a widespread platform in robotics education for pre-university students,
LEGO MINDSTORMS robots such as NXT or EV3 have also been used in many university courses [15–17].
As recent examples, LEGO robots are used to teach motion planning [18], physical manipulation [19],
and control systems [20] in real robots. These works combine LEGO robots and MATLAB.
According to Esposito [21], MATLAB and C language remain the dominant choices in programming
environments. A minority uses free, open-source packages such as the robot operating system (ROS)
or OpenCV.
This trend in web-based and practical educational frameworks also appears in other related
fields. For instance, Google Colaboratory (https://colab.research.google.com) provides a simple-to-use
infrastructure where the students can learn deep learning using Jupyter notebooks from their web
browser, which are run on the Google servers. No installation by the students is required.
In robotics, this is complemented by using robotic simulators through a web interface. For example,
one crucial online teaching initiative is Robot Ignite Academy from TheConstruct [28]. It is based on
ROS, the Gazebo simulator, and Jupyter, providing several short courses. Another is RobotBenchmark
(https://robotbenchmark.net) from Cyberbotics. Based on the Webots simulator, it is open and provides
twelve Python exercises with several robots, such as the Thymio II and the Nao humanoid.
Another important ROS-based initiative is Robot Programming Network [29–31]. This action
extends existing remote robot laboratories with the flexibility and power to write ROS code in a web
browser and run it in the robot on the server-side with a single click. It uses Moodle, VirtualBox,
HTML, and Robot Web Tools as its underlying infrastructure.
Recent Internet technologies open new possibilities for distance learning in robotics. For instance,
in ROSLab [32], ROS has been combined with JupyterLab and Docker container technology to
integrate documentation and robotics software. JupyterLab is very useful for teaching purposes,
as lessons or exercises may be delivered as Jupyter notebooks. These notebooks are programmed and
run locally by students, without any dependency problems, and in any operating system, just inside
a container. Another relevant example is cloud computing. For instance, Amazon RoboMaker
(https://aws.amazon.com/robomaker/) [33] allows cloud robotics, simulating, and deploying robotic
applications at a cloud-scale, which may then be available for distance students. It provides an
integrated development environment, simulation service, and fleet management capabilities.
Additionally, some drone-based tutorials have begun to appear at outstanding conferences.
For instance, the “Drone Vision and Deep Learning” tutorial [43] by Ioannis Pitas, at the IEEE/CVF
Conference on Computer Vision and Pattern Recognition (CVPR2020), focuses on advanced computer
vision techniques to be run onboard UAVs. They deal with semantic world mapping, detection of
targets/obstacles/crowds/points of interest, and 2D/3D target tracking. Moreover, they facilitate drone
autonomy, as drone vision plays a pivotal role in drone perception/control, coping with the limited
computing resources available onboard.
3. RoboticsAcademy Learning Environment
RoboticsAcademy (http://jderobot.github.io/RoboticsAcademy/) is an open-access platform
containing a collection of self-contained exercises to learn robotics and computer vision practically,
oriented to engineering education. The proposed activities (http://jderobot.github.io/RoboticsAcademy/
exercises/) cover a wide range of topics, including autonomous cars, mobile robots, industrial robotics,
and drones. As an evolution from its initial design [44], RoboticsAcademy is periodically updated with
new exercises and functionalities. Some of the exercises provided by the RoboticsAcademy educational
platform are depicted in Figure 1.
In each exercise, the student is challenged with a specific problem, which is expected to be solved
by programming standard robotics algorithms (e.g., perception, planning, and control) to provide
the robot with the necessary intelligence to complete the assigned task. Students shall program their
solutions using the Python programming language, selected due to its gradual learning curve and
growing popularity in robotics [45] and other fields.
The platform provides a simple application programming interface (API) to grant high-level
access to robot sensors and actuators for each exercise. This simplicity allows students to focus on the
algorithms' coding rather than battling with time-consuming hardware management details. From an
architectural point of view, each exercise is divided into three different layers (see Figure 2):
1. Hardware layer: represents the robot itself, being a real robot or a simulated one.
2. Middleware layer: includes the software drivers required to read information from robot sensors
and control robot actuators.
3. Application layer: contains the student's code (algorithms implemented by the student to solve
the problem proposed) and a template (internal platform functions that provide two simple
APIs to access robot hardware and take care of graphical issues for students' code debugging
purposes).
Figure 2. Design of exercises in RoboticsAcademy.
3.1. Hardware Layer
As opposed to other robotics teaching platforms [11], the exercises proposed in RoboticsAcademy
include a wide variety of robots (drones, mobile robots with wheels, cars, humanoids, industrial
manipulators) in real or simulated arrangements. Likewise, practical activities with popular sensors
such as lasers, GPS, and RGB-D cameras (Kinect, Xtion) can be performed.
The internal design of RoboticsAcademy allows the same student's code to be run in a physical
robot (when available) or its simulated counterpart, with only minor configuration changes. For the
3D simulation of the different robots and environments, the open-source Gazebo simulator [46] has
been used. The Gazebo simulator is widely recognized among the international robotics community.
Two flight controllers are conventional when using drones: ArduPilot and PX4. They implement
the low-level control and stabilization of the flying robot, leading to the execution of flight missions
as sequences of waypoints. They run onboard the drones themselves, allowing communication with
ground control stations or controlling applications. Regarding the presented course hardware, PX4 is
the flight controller supported in RoboticsAcademy, both on real drones and in the Gazebo simulator.
3.2. Middleware Layer
ROS [47] is the adopted middleware in RoboticsAcademy. ROS is the de facto standard middleware,
with the largest community of users in the robotics field. It provides a comprehensive collection of
drivers for a wide variety of robots, helping to shorten the programming development time and increase
the robustness of final applications due to code reuse. It supports many programming languages,
including Python. The current version used in RoboticsAcademy is ROS Melodic.
Concerning the middleware needed in the presented course, the MAVLink protocol [48] connects
the student's application with the drone flight controller. ROS supports MAVLink through its
MAVROS package. For onboard drone cameras' support, regular ROS image drivers are used.
In RoboticsAcademy, MAVROS has been extended to support speed regulation of the flying robot.
In this way, the drone programming interface in RoboticsAcademy provides both the high-level classic
waypoint missions and the mid-level speed commands to rule drone movements. Speed commands
allow more fine-grained control of the drone, such as the one required in the flying robots' visual
control applications.
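As an illustrative sketch of these mid-level speed commands (assuming the standard MAVROS setpoint_velocity plugin; the exact topic name may differ between MAVROS versions), the following Python snippet streams a constant velocity setpoint, since PX4 expects a continuous stream of setpoints while in offboard control:

import rospy
from geometry_msgs.msg import Twist

rospy.init_node('speed_commander')
pub = rospy.Publisher('/mavros/setpoint_velocity/cmd_vel_unstamped',
                      Twist, queue_size=1)
rate = rospy.Rate(20)   # a steady stream of setpoints, at 20 Hz
cmd = Twist()
cmd.linear.x = 1.0      # 1 m/s forward
cmd.angular.z = 0.2     # gentle yaw rate, in rad/s
while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()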
3.3. Application Layer
The application layer comprises the student's code and the exercise template, which acts as a
container of the student's code and provides the following services:
• Hardware abstraction layer (HAL) API: a simple API to obtain processed data from robot sensors
and command robot actuators at a high level.
• Graphical user interface (GUI) API: used for checking and debugging the student's code, it shows
sensory data or partial processing in a visual way (for example, the images from the robot
camera or the results of applying color filtering).
• Timing skeleton for implementing the reactive robot behavior: a continuous loop of iterations,
executed several times per second. Each iteration performs four steps: collecting sensory data,
processing them, deciding actions, and sending orders to the actuators (a schematic sketch of
this loop follows).
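The sketch below illustrates this timing skeleton (the function names are illustrative, not the literal template of any particular exercise; the drone object stands for the HAL API described below):

import time

ITERATION_PERIOD = 0.1  # seconds, i.e., several iterations per second

def process(image):
    # The student's perception code would go here (e.g., a color filter).
    return image

def decide(percepts):
    # The student's control code: map percepts to velocities (stub values).
    return 0.5, 0.0, 0.0, 0.0   # vx, vy, vz, yaw_rate

def run(drone):
    # drone is the HAL object injected by the exercise template.
    while True:
        start = time.time()
        image = drone.get_frontal_image()          # 1. collect sensory data
        percepts = process(image)                  # 2. process them
        vx, vy, vz, yaw_rate = decide(percepts)    # 3. decide actions
        drone.set_cmd_vel(vx, vy, vz, yaw_rate)    # 4. command the actuators
        time.sleep(max(0.0, ITERATION_PERIOD - (time.time() - start)))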
The application layer is implemented as a ROS node. The template provided in each node supports
access to the robot's sensors and actuators using the appropriate ROS methods and communication
mechanisms (topics, subscriptions, publishing, etc.). The hardware complexity and details are hidden
to grant a simple local API in Python for the student (HAL API), employing ROSpy (a pure Python
client library for ROS). Figure 3 shows a typical execution of an exercise using a simulated robot.
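As a rough illustration of what the template hides from the student (the topic name here is hypothetical; actual topics depend on each robot model), a HAL-style wrapper over a ROS camera driver could look like this:

import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

class FrontalCamera:
    # Sketch of a HAL-style wrapper hiding the ROS plumbing from students.
    def __init__(self, topic='/drone/cam_frontal/image_raw'):  # hypothetical topic
        self._bridge = CvBridge()
        self._last_image = None
        rospy.Subscriber(topic, Image, self._on_image, queue_size=1)

    def _on_image(self, msg):
        # Convert the ROS Image message into an OpenCV BGR array.
        self._last_image = self._bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')

    def get_frontal_image(self):
        return self._last_image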
Figure 3. Typical execution of an exercise of RoboticsAcademy using the Gazebo viewer.
The drone HAL API used in all the exercises of the presented course includes these methods for
getting sensor measurements, images, and drone state:
• drone.get_position(): returns the actual position of the drone as a 1 × 3 NumPy array [x, y, z],
in meters.
• drone.get_velocity(): returns the actual velocities of the drone as a 1 × 3 NumPy array [vx, vy, vz],
in m/s.
• drone.get_yaw_rate(): returns the actual yaw rate of the drone, in rad/s.
• drone.get_orientation(): returns the actual roll, pitch, and yaw of the drone as a 1 × 3 NumPy
array [roll, pitch, yaw], in rad.
• drone.get_frontal_image(): returns the latest image from the frontal camera as an OpenCV
image (cv2_image).
• drone.get_ventral_image(): returns the latest image from the ventral camera as an OpenCV
image (cv2_image).
• drone.get_landed_state(): returns one if the drone is on the ground (landed), two if it is in the air,
and four if landing.
Additionally, the HAL API provides these methods for drone control:
• drone.set_cmd_vel(vx, vy, vz, yaw_rate): commands the linear velocity of the drone in the x, y,
and z directions (in m/s) and the yaw rate (rad/s) in its body-fixed frame.
• drone.set_cmd_mix(vx, vy, z, yaw_rate): commands the linear velocity of the drone in the x,
y directions (in m/s), the height (z) relative to the takeoff point, and the yaw rate (in rad/s).
• drone.takeoff(height): takes off from the current location to the given elevation (in meters).
• drone.land(): lands at the current location.
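As a usage illustration, the following fragment combines the documented HAL methods into a minimal takeoff-cruise-land sequence (the drone object is provided by the exercise template, as described above):

import time

def takeoff_cruise_land(drone):
    drone.takeoff(3.0)                          # climb to 3 m over the takeoff point
    t0 = time.time()
    while time.time() - t0 < 5.0:               # cruise forward for five seconds
        pos = drone.get_position()              # [x, y, z], in meters
        print('position: x=%.1f y=%.1f z=%.1f' % (pos[0], pos[1], pos[2]))
        drone.set_cmd_vel(0.5, 0.0, 0.0, 0.0)   # 0.5 m/s forward, no yaw
        time.sleep(0.1)
    drone.set_cmd_vel(0.0, 0.0, 0.0, 0.0)       # stop before landing
    drone.land()
    while drone.get_landed_state() != 1:        # 1 means landed, per the API above
        time.sleep(0.5)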
The RoboticsAcademy native release is an open-source platform, free and simple to use. All of its
underlying infrastructure (Python, Gazebo, ROS) is provided as regular and standard official Debian
packages, and so they can be installed easily on Ubuntu Linux computers. Additionally, the required
underlying resources, which are not contained in traditional Gazebo or ROS packages, are created and
stored in several Debian packages from the JdeRobot organization.
The native release runs on Ubuntu Linux machines. Docker images have been created and are
maintained for Windows and macOS users, so that they can use the platform inside a virtualized
environment (Docker container) on their computers.
4. Drone Programming Course
The syllabus of the proposed course comprises six academic units, including:
• Unit 1. Introduction to aerial robotics: types of UAVs, real drone applications such as military,
logistics, visual inspection.
• Unit 2. Drone sensors and perception: inertial measurement units (IMUs), compass, GPS, LIDAR,
cameras, elementary image processing.
• Unit 3. Flight physics and basic control: 3D geometry, quadrotor physics, basic movements,
hovering, forward motion, rotation, stabilization.
The practical content of the course is covered with the following RoboticsAcademy exercises:
Drones:
• DR1. Navigation by position
• DR2. Following an object on the ground
• DR3. Following a road with visual control
• DR4. Cat and mouse
• DR5. Escape from a maze following visual clues
• DR6. Searching for people to rescue within a perimeter
• DR7. Escape from a hangar with moving obstacles
• DR8. Land on a beacon
Computer Vision:
• CV1. Color filter for object detection
• CV2. Visual odometry for self-localization
Figure 5. “Following an object on the ground” exercise.
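Solutions to the vision-based exercises typically start from a simple color filter on the camera images. A minimal OpenCV sketch follows (the HSV thresholds are illustrative values, not the course's reference solution); the centroid offset from the image center can then feed a simple proportional control on the drone's velocities:

import cv2
import numpy as np

def detect_object(bgr_image):
    # Return the centroid of the largest region passing a color filter, or None.
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m['m00'] == 0:
        return None
    return (int(m['m10'] / m['m00']), int(m['m01'] / m['m00']))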
Figure 7. “Escape from a maze using visual clues” exercise.
The search and rescue exercise, for instance, is typically implemented as a finite state machine,
with several states such as go-to-the-perimeter, explore-inside-the-perimeter, or go-back-home.
Figure 8. “Searching for people to rescue within a perimeter” exercise.
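A minimal sketch of such a state machine is shown below (the states, coordinates, and helper functions are illustrative, not the reference solution; the drone object is the HAL API described in Section 3):

import numpy as np

GO_TO_PERIMETER, EXPLORE, GO_BACK_HOME = range(3)
PERIMETER_CENTER = np.array([30.0, 30.0, 5.0])  # hypothetical map coordinates
HOME = np.array([0.0, 0.0, 5.0])
EXPLORATION_BUDGET = 120.0                      # seconds, illustrative value

def people_detected(ventral_image):
    # Stub for the student's detector (e.g., an OpenCV face cascade).
    return False

def rescue_step(state, elapsed, drone):
    # Runs one iteration of the behavior and returns the (possibly new) state.
    pos = drone.get_position()
    if state == GO_TO_PERIMETER:
        error = PERIMETER_CENTER - pos
        if np.linalg.norm(error) < 1.0:
            return EXPLORE
        v = 0.5 * error / np.linalg.norm(error)
        drone.set_cmd_vel(v[0], v[1], v[2], 0.0)
    elif state == EXPLORE:
        drone.set_cmd_vel(0.5, 0.0, 0.0, 0.3)   # crude outward sweep
        if people_detected(drone.get_ventral_image()):
            print('victim found near', pos)
        if elapsed > EXPLORATION_BUDGET:
            return GO_BACK_HOME
    elif state == GO_BACK_HOME:
        error = HOME - pos
        if np.linalg.norm(error) < 1.0:
            drone.land()
        else:
            v = 0.5 * error / np.linalg.norm(error)
            drone.set_cmd_vel(v[0], v[1], v[2], 0.0)
    return state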
5. Online Competitions and Gamified Learning
Gamification techniques, such as setting challenges and giving rewards or badges, are implemented
in the classroom to make the learning process more attractive to the student [49].
It is already known that gamified learning in higher education (and in engineering degrees
in particular) enhances students’ engagement and motivation and improves their attendance ratios
and grades [50,51]. Positive reinforcement in both on-site and distance classrooms is promoted
when managing the student load using challenges of progressive difficulty and gamified step-by-step
tasks [52]. Additionally, gamification provides instant student feedback and grading, which has been
proven to help formative self-assessment and self-evaluation [53].
Through stepwise updates, RoboticsAcademy has been redesigned to support gamified learning
in computer vision and robotics, including:
• Competitive exercises. Several exercises in our present collection are based on completing a task
in a given time while competing against the fastest or most complete solution, increasing the
student’s engagement.
• Social interactions between students, teachers, and developers, promoted through a dedicated
RoboticsAcademy web forum (https://developers.unibotics.org/) and several accounts on prime
social media platforms since 2015, such as a video channel on YouTube (https://www.
The organizing committee programmed several different autonomous mice drones to allow for
three qualifying sessions, starting from more straightforward movement behaviors, and transitioning
to more complex behaviors to track mice. Some of them were released for training. An automatic
assessment tool (referee) was also developed to score each 2 min round systematically. The referee
measured the instantaneous distance from each cat drone to the mouse, showing its evolution along
the whole game on the screen. When the distance was below a certain threshold (close-threshold),
both the progress line and the background were painted green; when it was above, they were painted
red. As long as the cat
drone was near the mouse, the score increased. It was forbidden to touch the mouse, with each contact
penalized in the final round score.
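The essence of the referee's scoring logic can be summarized in a few lines (a hedged reconstruction from the description above; the threshold, penalty, and tick values are illustrative, and the real referee also renders the progress line and the game view):

import numpy as np

CLOSE_THRESHOLD = 2.0   # meters, illustrative value
CONTACT_PENALTY = 5.0   # points per touch, illustrative value
TICK = 0.1              # seconds between referee measurements

def update_score(score, cat_pos, mouse_pos, touched):
    # One referee tick: reward proximity, penalize any contact.
    distance = np.linalg.norm(np.asarray(cat_pos) - np.asarray(mouse_pos))
    if distance < CLOSE_THRESHOLD:
        score += TICK       # the score grows while the cat stays near the mouse
    if touched:
        score -= CONTACT_PENALTY
    return score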
In the second challenge, named escape from the hangar, a single mobile aerial robot had to
take off inside the hangar and find its way out in less than 1 min while avoiding collisions with six
inner walls, which were moving in a random pattern to increase the difficulty (see Figure 12). Again,
an automatic referee was built to measure how many seconds each aerial robot took to leave the
hangar. The initial score started from 60 points and decreased during the time each robot was inside the
hangar. It was forbidden to touch the walls, and each contact further reduced the score. Additionally,
the drone's starting point was changed during the competition, so a pure position-based control was
not a proper strategy.
The video of the final round of the IROS2018-Program-A-Robot competition is available online
(https://www.youtube.com/watch?v=_0ZkciOHnmU). It shows the performance of the uploaded
solutions of the best three participants in each challenge, together with their automatic evaluations
for each 2 min round. In summary, it was an excellent acid test for the capability of RoboticsAcademy
as an online platform for holding, among others, distance competitions in robotics and computer vision.
More than 40 students participated in the competitions (Figure 13). Their feedback was very
positive across the three editions, which encouraged us to scale up in scope from a local competition to
a national one, and finally to an international one. For instance, “Thank you for organizing this drone
programming competition; this type of enterprise is really cool”, from one participant in the 2017
edition, is an illustrative example. Another participant of the IROS2018 edition even kept engaged with
RoboticsAcademy after the competition and developed a new exercise for it, the “Visual Odometry
with RGBD Camera” exercise (https://jderobot.github.io/RoboticsAcademy/exercises/ComputerVision/
visual_odometry), as an open-source contribution to the project.
6. Conclusions
The main features of the presented RoboticsAcademy platform can be summarized as follows:
• Each exercise is divided into three layers: hardware, middleware, and application. This internal
design makes it easier to run the same student’s code in physical and simulated robots with
only minor configuration changes. The open-source Gazebo simulator, widely recognized in the
robotics community, has been used for 3D simulations of all robots, sensors, and environments.
• The middleware layer is ROS-based, the de facto standard in service robotics that supports many
programming languages, including Python. Previously, RoboticsAcademy was decoupled from
this widely used middleware, needing frequent maintenance and having little scalability and a
small number of users (only those who already had a thorough background in the tool).
• The application layer contains the student’s code. It includes a hardware abstraction layer (HAL)
API to grant students high-level access to the robot sensors and actuators, and a graphical user
interface (GUI) API to show sensor data, process images, or debug code.
• RoboticsAcademy runs natively on Linux Ubuntu machines and can be installed using official
Debian packages. Docker containers are used to allow easy installation in Windows and macOS
based computers, making RoboticsAcademy particularly suited to the distance learning approach.
A full open-access distance course covering diverse drone programming aspects has been
proposed, aimed at engineering students. Its syllabus comprises six academic units, ranging
from sensors and actuators to navigation and computer vision. The course includes practical content,
all covered with RoboticsAcademy exercises. It was successfully tested and validated with students of
several engineering degrees at the Rey Juan Carlos University over five academic years. In this course,
any student could learn and practice from home, anytime. End-of-course surveys showed that using
the platform was helpful, productive, and motivating for them.
RoboticsAcademy also includes additional functionalities, such as automatic assessment tools and
competitive goals, making it a perfectly adapted tool for gamified learning strategies. It has been proven
as a convenient platform for organizing online robotics competitions as well. The Program-A-Robot
annual championship was born in 2016. It has already celebrated its third edition using
RoboticsAcademy, proposing competitive challenges about programming the autonomous intelligence
of different vision-assisted mobile robots.
Future work lines for enhancing RoboticsAcademy include developing more exercises,
expanding the present full course, and proposing different specific learning paths to reach a broader
range of educational stages. Furthermore, an in-depth experimental analysis is planned to provide
quantitative evidence about user experience, creating a complete questionnaire and collecting
information from other students. This will provide a more solid methodological basis.
Author Contributions: Conceptualization, J.M.C. and D.M.-M.; formal analysis, J.M.C.; funding acquisition,
J.M.C.; investigation, P.A. and D.R.-Á.; methodology, P.A.; project administration, J.M.C. and J.V.; resources, J.V.;
software, D.M.-M., P.A., D.R.-Á. and J.F.-C.; supervision, J.M.C.; validation, D.M.-M. and L.G.-P.; visualization,
L.G.-P.; writing—original draft, J.V., L.G.-P. and J.F.-C.; writing—review and editing, J.M.C. and J.F.-C. All authors
have read and agreed to the published version of the manuscript.
Funding: This research was partially funded by the Community of Madrid in the framework of two research
projects: (1) Multiannual agreement with Rey Juan Carlos University in line of action 1, “Encouragement of Young
Ph.D. Students Investigation” project ref. F664, acronym UNIBOTICS2.0. (2) RoboCity2030-DIH-CM (2019-2022):
RoboCity2030—Madrid Robotics Digital Innovation Hub, Programa de Actividades de I+D entre Grupos de
investigación de la Comunidad de Madrid en Tecnologías 2018. Project ref. S2018/NMT-4331.
Acknowledgments: The authors thank Google for funding the JdeRobot non-profit organization in its calls for
Google Summer of Code 2015, 2017, 2018, 2019, and 2020.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Aliane, N. Teaching fundamentals of robotics to computer scientists. Comput. Appl. Eng. Educ. 2011,
19, 615–620. [CrossRef]
2. Mateo, T.; Andujar, J. Simulation tool for teaching and learning 3d kinematics workspaces of serial robotic
arms with up to 5-DOF. Comput. Appl. Eng. Educ. 2012, 20, 750–761. [CrossRef]
3. Mateo, T.; Andujar, J. 3D-RAS: A new educational simulation tool for kinematics analysis of anthropomorphic
robotic arms. Int. J. Eng. Educ. 2011, 27, 225–237.
4. Lopez-Nicolas, G.; Romeo, A.; Guerrero, J. Simulation tools for active learning in robot control and
programming. In Proceedings of the 20th EAEEIE Annual Conference, Valencia, Spain, 22–24 June 2009;
Innovation in Education for Electrical and Information Engineering: New York, NY, USA, 2009.
5. Lopez-Nicolas, G.; Romeo, A.; Guerrero, J. Active learning in robotics based on simulation tools. Comput. Appl.
Eng. Educ. 2014, 22, 509–515. [CrossRef]
6. Jara, C.; Candelas, F.; Pomares, J.; Torres, F. Java software platform for the development of advanced robotic
virtual laboratories. Comput. Appl. Eng. Educ. 2013, 21, 14–30. [CrossRef]
7. Gil, A.; Reinoso, O.; Marin, J.; Paya, L.; Ruiz, J. Development and deployment of a new robotics toolbox for
education. Comput. Appl. Eng. Educ. 2015, 23, 443–454. [CrossRef]
8. Fabregas, E.; Farias, G.; Dormido-Canto, S.; Guinaldo, M.; Sanchez, J.; Dormido, S. Platform for teaching
mobile robotics. J. Intell. Robot. Syst. 2016, 81, 131–143. [CrossRef]
9. Detry, R.; Corke, P.; Freese, M. TRS: An Open-Source Recipe for Teaching/Learning Robotics with a Simulator.
2014. Available online: http://ulgrobotics.github.io/trs (accessed on 16 December 2020).
10. Guzman, J.; Berenguel, M.; Rodriguez, F.; Dormido, S. An interactive tool for mobile robot motion planning.
Robot. Auton. Syst. 2008, 56, 396–409. [CrossRef]
11. Guyot, L.; Heiniger, N.; Michel, O.; Rohrer, F. Teaching robotics with an open curriculum based on the
e-puck robot, simulations and competitions. In Proceedings of the 2nd International Conference on
Robotics in Education (RiE 2011), Vienna, Austria, 15–16 September 2011; Stelzer, R., Jafarmadar, K., Eds.;
Ghent University: Ghent, Belgium, 2011; pp. 53–58.
12. Soto, A.; Espinace, P.; Mitnik, R. A mobile robotics course for undergraduate students in computer science.
In Proceedings of the 2006 IEEE 3rd Latin American Robotics Symposium (LARS’06), Santiago, Chile,
26–27 October 2006; IEEE: New York, NY, USA, 2007; pp. 187–192.
13. Thrun, S. Teaching challenge. IEEE Robot. Autom. Mag. 2006, 13, 12–14. [CrossRef]
14. Jara, C.A.; Candelas, F.A.; Puente, S.; Torres, F. Hands-on experiences of undergraduate students in automatics
and robotics using a virtual and remote laboratory. Comput. Educ. 2011, 57, 2451–2461. [CrossRef]
Electronics 2020, 9, 2163 17 of 18
15. Cliburn, D.C. Experiences with the LEGO Mindstorms throughout the Undergraduate Computer Science Curriculum.
In Frontiers in Education, Proceedings of the 36th Annual Conference, San Diego, CA, USA, 27–31 October 2006; IEEE:
New York, NY, USA, 2007; pp. 1–6. [CrossRef]
16. Gomez-de-Gabriel, J.M.; Mandow, A.; Fernandez-Lozano, J.; García-Cerezo, A.J. Using LEGO NXT Mobile
Robots with LabVIEW for Undergraduate Courses on Mechatronics. IEEE Trans. Educ. 2011, 54, 41–47. [CrossRef]
17. Cuéllar, M.; Pegalajar Jiménez, M. Design and Implementation of Intelligent Systems with LEGO Mindstorms
for Undergraduate Computer Engineers. Comput. Appl. Eng. Educ. 2014, 22, 153–166. [CrossRef]
18. Montés, N.; Rosillo, N.; Mora, M.C.; Hilario, L. Real-Time Matlab-Simulink-Lego EV3 Framework for
Teaching Robotics Subjects. In Proceedings of the International Conference on Robotics and Education RiE 2017;
Springer: Cham, Switzerland, 2018. [CrossRef]
19. Gonzalez-Garcia, S.; Rodríguez, J.; Loreto, G.; Montaño Serrano, V. Teaching forward kinematics in a
robotics course using simulations: Transfer to a real-world context using LEGO mindstorms™. Int. J. Interact.
Des. Manuf. 2020, 14. [CrossRef]
20. Zhang, M.; Wan, Y. Improving Learning Experiences Using LEGO Mindstorms EV3 Robots in Control
Systems Course. Int. J. Electr. Eng. Educ. 2020. [CrossRef]
21. Esposito, J.M. The state of robotics education: Proposed goals for positively transforming robotics education
at postsecondary institutions. IEEE Robot. Autom. Mag. 2017, 24, 157–164. [CrossRef]
22. Corke, P.; Greener, E.; Philip, R. An Innovative Educational Change: Massive Open Online Courses in
Robotics and Robotic Vision. IEEE Robot. Autom. Mag. 2016, 23, 81–89. [CrossRef]
23. Artificial Intelligence for Robotics. Available online: https://www.udacity.com/course/artificial-intelligence-
for-robotics--cs373 (accessed on 5 November 2020).
24. Autonomous Mobile Robots. Available online: https://www.edx.org/course/autonomous-mobile-robots-
ethx-amrx-1 (accessed on 5 November 2020).
25. Pozzi, M.; Malvezzi, M.; Prattichizzo, D. Mooc on the art of grasping and manipulation in robotics:
Design choices and lessons learned. In Proceedings of the International Conference on Robotics and
Education RiE 2017, Sofia, Bulgaria, 26–28 April 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 71–78.
26. Kulich, M.; Chudoba, J.; Kosnar, K.; Krajnik, T.; Faigl, J.; Preucil, L. Syrotek-distance teaching of mobile
robotics. IEEE Trans. Educ. 2013, 56, 18–23. [CrossRef]
27. Zalewski, J.; Gonzalez, F. Evolution in the Education of Software Engineers: Online Course on Cyberphysical
Systems with Remote Access to Robotic Devices. Int. J. Online Eng. 2017, 13, 133–146. [CrossRef]
28. Téllez, R.; Ezquerro, A.; Rodríguez, M.Á. ROS in 5 Days: Entirely Practical Robot Operating System Training;
Independently Published: Madrid, Spain, 2016.
29. Casañ, G.A.; Cervera, E.; Moughlbay, A.A.; Alemany, J.; Martinet, P. ROS-based online robot programming
for remote education and training. In Proceedings of the 2015 IEEE International Conference on Robotics
and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; IEEE: New York, NY, USA, 2015; pp. 6101–6106.
30. Cervera, E.; Martinet, P.; Marin, R.; Moughlbay, A.A.; Del Pobil, A.P.; Alemany, J.R.; Casañ, G. The robot
programming network. J. Intell. Robot. Syst. 2016, 81, 77–95. [CrossRef]
31. Casañ, G.; Cervera, E. The Experience of the Robot Programming Network Initiative. J. Robot. 2018,
2018, 2312984. [CrossRef]
32. Cervera, E.; Del Pobil, A.P. Roslab: Sharing ROS Code Interactively with Docker and Jupyterlab. IEEE Robot.
Autom. Mag. 2019, 26, 64–69. [CrossRef]
33. Liu, Y.; Xu, Y. Summary of cloud robot research. In Proceedings of the 2019 25th International Conference on
Automation and Computing (ICAC), Lancaster, UK, 5–7 September 2019; IEEE: New York, NY, USA, 2019;
pp. 1–5.
34. Autonomous Navigation for Flying Robots. Available online: https://www.edx.org/course/autonomous-
navigation-for-flying-robots (accessed on 5 November 2020).
35. Engel, J.; Sturm, J.; Cremers, D. Scale-aware navigation of a low-cost quadrocopter with a monocular camera.
Robot. Auton. Syst. 2014, 62, 1646–1656. [CrossRef]
36. Engel, J.; Sturm, J.; Cremers, D. Camera-based navigation of a low-cost quadrocopter. In Proceedings of the
2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Algarve, Portugal, 7–12 October
2012; IEEE: New York, NY, USA, 2012; pp. 2815–2821.
37. Learn the Importance of Autonomous Systems and Drone Technologies. Available online: https://www.edx.
org/professional-certificate/umgc-usmx-drones-and-autonomous-systems (accessed on 5 November 2020).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional
affiliations.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).