Free As A Bird: Event-Based Dynamic Sense-And-Avoid for Ornithopter Robot Flight
Fig. 3. Block diagram of the Dynamic Object Motion Estimation method: (a) Time Filter detects events triggered by moving objects; (b) Corner Detector finds
relevant features; (c) Optical Flow estimates the motion of events in the image plane; and (d) Clustering estimates the mean object flow. For clearer visualization, the results are shown in event images accumulating events every 10 ms.
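Read as a processing chain, the four blocks in Fig. 3 suggest a simple per-event skeleton. The snippet below is only an illustrative sketch of that structure, with each stage as a placeholder callable; the function names and signatures are our own, not the authors' code.

```python
from typing import Callable, Optional, Tuple

Event = Tuple[int, int, float]   # (x, y, timestamp)
Flow = Tuple[float, float]       # (vx, vy)

def process_event(event: Event,
                  time_filter: Callable[[Event], bool],
                  is_corner: Callable[[Event], bool],
                  flow_of: Callable[[Event], Flow],
                  assign_to_cluster: Callable[[Event, Flow], None]) -> Optional[Flow]:
    """Per-event chain mirroring Fig. 3: (a) keep only events from moving objects,
    (b) keep only corner events, (c) estimate their optical flow, and
    (d) feed the flow to the clustering stage that maintains the mean object flow."""
    if not time_filter(event):        # (a) Time Filter
        return None
    if not is_corner(event):          # (b) Corner Detector
        return None
    flow = flow_of(event)             # (c) Optical Flow
    assign_to_cluster(event, flow)    # (d) Clustering
    return flow
```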
Fig. 4. Event-based dynamic object detection: right) frame from the APS
sensor of the DAVIS 346; left) event image of events accumulated during 10 ms
(in black) including events detected to belong to the moving object (magenta).
performs event-by-event processing by computing the flow of each incoming event. Our method integrates an adapted version of ABMOF to reduce the onboard processing. First, event slices are fed only with events triggered by moving objects. This reduces the computational load by processing only <32% of the event stream in the experiments of Section V. Second, the optical flow is computed only from events considered as corners. The eFast event corner detector in [29] is selected for this task given its fast response and low False Positive Rate. Computing optical flow only from corners reduces the computational cost by >25% while providing a stable optical flow estimation, as described in Section V.

Finally, the resulting event optical flow estimations are clustered to obtain an approximation of the object's optical flow. For this task, we used an adapted version of the event-by-event clustering algorithm described in [30]. This algorithm clusters events with spatio-temporal continuity within an adaptive time window. The algorithm was modified to cluster optical flow from events. It receives as input the tuple f = (x, ts, v), where v = (vx, vy) represents the optical flow estimation of the event with pixel coordinates x and timestamp ts. Clustering is performed by evaluating the proximity of each new event to a randomly selected event from each of the previous clusters. Each cluster is defined by its centroid x̄, which represents the average location of cluster events, the average optical flow v̄ = (v̄x, v̄y), and the list Φ of previous tuples f assigned to the cluster. Each new valid optical flow estimation updates v̄ as in (2):

\bar{v} := \frac{\eta}{\eta + 1}\,\bar{v} + \frac{1}{\eta + 1}\,v, \qquad (2)

\bar{v} := \frac{\eta}{\eta - 1}\,\bar{v} - \frac{1}{\eta - 1}\,v^{\dagger}, \qquad (3)

where η is the number of tuples assigned to the cluster (i.e., the length of Φ). Similarly, the influence of old samples is removed from v̄, see (3), where v† corresponds to flow samples with timestamps lower than τc. The parameter τc defines the lifetime of samples in the cluster. It is initially set to 5 ms and dynamically adjusted using the feedback reference from [28] in the experiments of Section V. Finally, ASAP prevents processing overflows by dynamically adapting event packaging to keep the responsiveness of the method.

B. Collision Risk Evaluation Strategy

A reactive evasive maneuver strategy is used to prevent collision risk situations with incoming obstacles. The possible collisions with detected obstacles are determined by considering the geometry of the robot. The ornithopter is approximated by a 2W × 2H safety volume and the obstacle is approximated by a sphere of radius R (see Fig. 5). R̄ encloses the obstacle volume and a Safety Distance b that accounts for small sensing uncertainties, i.e., R̄ = R + b. Thus, the minimum angles ψ* and θ* to avoid a possible collision risk without deviating the flight trajectory are:

\psi^{*}(t) = \arctan\!\left(\frac{W + 2\bar{R}}{z(t) - 2\bar{R}}\right), \qquad \theta^{*}(t) = \arctan\!\left(\frac{H + 2\bar{R}}{z(t) - 2\bar{R}}\right), \qquad (4)

where z(t) is the obstacle depth w.r.t. the camera. For any pair of angles (ψ, θ) such that |ψ(t1)| ≤ |ψ*(t1)| and |θ(t1)| ≤ |θ*(t1)|
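To make the two computations above concrete, the sketch below shows one possible per-cluster implementation of the incremental mean-flow update in (2) and (3) and of the avoidance-angle bounds in (4). It is a minimal illustration under our own assumptions (a simple cluster record, hypothetical names such as `FlowCluster` and `collision_angle_bounds`, and depth z assumed available from the detection pipeline); it is not the onboard implementation.

```python
import math

class FlowCluster:
    """Minimal cluster record: running mean flow plus the list of (timestamp, flow) samples."""

    def __init__(self):
        self.mean_flow = [0.0, 0.0]   # running mean (v_x, v_y), i.e. the cluster flow v-bar
        self.samples = []             # list Phi of (ts, (v_x, v_y)) tuples

    def add_sample(self, ts, v):
        """Incorporate a new flow sample as in (2): v_bar := n/(n+1)*v_bar + 1/(n+1)*v."""
        n = len(self.samples)         # eta: tuples already assigned to the cluster
        self.mean_flow = [n / (n + 1) * m + v_i / (n + 1)
                          for m, v_i in zip(self.mean_flow, v)]
        self.samples.append((ts, v))

    def remove_stale(self, now, tau_c):
        """Remove samples older than tau_c as in (3): v_bar := n/(n-1)*v_bar - 1/(n-1)*v_old."""
        for ts, v_old in [s for s in self.samples if now - s[0] > tau_c]:
            n = len(self.samples)
            if n <= 1:                # last sample leaves: reset the cluster flow
                self.samples, self.mean_flow = [], [0.0, 0.0]
                break
            self.mean_flow = [n / (n - 1) * m - v_i / (n - 1)
                              for m, v_i in zip(self.mean_flow, v_old)]
            self.samples.remove((ts, v_old))

def collision_angle_bounds(W, H, R, b, z):
    """Avoidance-angle bounds of (4) for a 2W x 2H safety volume and an obstacle of
    radius R enlarged by the safety distance b, at depth z (assumes z > 2*(R + b))."""
    R_bar = R + b
    psi_star = math.atan((W + 2 * R_bar) / (z - 2 * R_bar))
    theta_star = math.atan((H + 2 * R_bar) / (z - 2 * R_bar))
    return psi_star, theta_star

def is_collision_risk(psi, theta, psi_star, theta_star):
    """Obstacle bearing inside both angular bounds: treat it as a collision risk
    (our reading of the condition on |psi| and |theta| stated in the text)."""
    return abs(psi) <= psi_star and abs(theta) <= theta_star
```

Here ψ and θ would be the obstacle bearing angles obtained from its image position and depth; the check mirrors the condition on |ψ(t1)| and |θ(t1)| stated above.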
Fig. 6. Different size objects considered for the experiments: left) small box; center) stuffed toy; and right) FitBall.

Fig. 7. Histogram of the distance between the detected object and the ground truth in the overall tests.

TABLE II. Accuracy, Precision, True Positive Rate (TPR), and False Positive Rate (FPR) results of dynamic obstacle detection using the different obstacles while varying the launching distance.

communication overflow. Due to the computational limitations of the onboard computer, ASAP was configured to feed our algorithm with event packages at a maximum rate of 250 Hz, without hampering event transmission or producing processing overflow. Hence, our pipeline was configured to output v̄ at the same rate as the input packages.
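As a rough illustration of the rate cap described above (only the 250 Hz packaging limit, not the adaptive ASAP scheme of [26]; the class name and interface below are hypothetical), events could be buffered and dispatched as packages no faster than a fixed maximum rate:

```python
import time
from collections import deque

class EventPackager:
    """Buffers incoming events and dispatches them as packages,
    capped at a maximum package rate (e.g., 250 Hz)."""

    def __init__(self, max_rate_hz=250.0):
        self.min_period = 1.0 / max_rate_hz   # minimum time between packages
        self.buffer = deque()
        self.last_dispatch = 0.0

    def push(self, event):
        # event: (x, y, timestamp, polarity) tuple from the sensor driver
        self.buffer.append(event)

    def try_dispatch(self, now=None):
        """Return a package (list of buffered events) if enough time has elapsed
        since the last one; otherwise return None and keep buffering."""
        now = time.monotonic() if now is None else now
        if self.buffer and (now - self.last_dispatch) >= self.min_period:
            package = list(self.buffer)
            self.buffer.clear()
            self.last_dispatch = now
            return package
        return None
```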
The proposed dynamic obstacle sense-and-avoid system was experimentally validated in both indoor and outdoor scenarios. The GRVC Testbed is a closed area of 15 × 21 × 8 m with 24 OptiTrack Primex 13 cameras providing millimeter-accuracy pose estimations. The outdoor scenario is a Soccer Field of 48 × 54 m with surrounding obstacles, suitable to perform short flight experiments with the ornithopter. For all the experiments, the parameters in (1) were set to vL = 3 m/s, vH = 6 m/s, ωL = 1.3 rad/s, and ωH = 3 rad/s, given the typical flight kinematics of the robot. Further, α was set to 0.8, as the robot mainly performs forward motions. Parameter τ sets the threshold to distinguish the events triggered by dynamic objects. A large value entails detections at longer distances while letting through events triggered by static objects. Conversely, a small value of τ performs very selective filtering at the expense of reducing the distance at which the obstacles are detected. Hence, τ defines a trade-off between filtering events triggered by static objects and the maximum distance at which an obstacle can be detected. From our experiments, τL = 15 ms and τH = 25 ms were empirically selected as a suitable trade-off, achieving detection distances of 6 m while filtering 82% of the events triggered by the scene background.

The experimental validation was divided into two parts. First, the dynamic obstacle detection and motion estimation were evaluated. Second, the evaluation of the full obstacle avoidance system was performed in experiments in indoor and outdoor scenarios with different illumination conditions.

A. Obstacle Detection and Motion Estimation Evaluation

In these experiments, obstacles were thrown in different directions into the Field of View (FoV) of the camera while the ornithopter performed forward flight. The experiments were performed in the Testbed to retrieve the pose information from the obstacle and the robot. The goal was to evaluate the detection and motion estimation performance of the method described in Section IV-A. Three objects with different sizes were considered (Fig. 6): a Small Box of size 220 × 200 × 150 mm, a Stuffed Toy of size 400 × 450 × 400 mm, and a FitBall with a diameter of 750 mm. Obstacle detection was evaluated at distances ranging from 0.5 m to 6 m. At distances closer than 0.5 m the obstacle body filled a large zone of the image, leading to invalid detections. At large distances obstacles were represented by few pixel events due to the camera resolution, which hindered their 2D representation for obstacle detection. A total of 45 experiments were performed with each obstacle.

The detection performance was evaluated by comparing the outcome of our algorithm with the frames provided by the APS sensor. The centroid of the detected moving object was rendered into event images obtained every 25 ms. The distance between the object's centroid in the two frames was used as the evaluation criterion. Each comparison led to a possible result: False Positive (FP), False Negative (FN), True Positive (TP), and True Negative (TN). A True Positive occurred when the Manhattan distance between the centroids of both objects was ≤ η pixels. Fig. 7 shows a histogram of the distance between the object and the ground truth in the performed experiments. The majority of samples at distances >15 px correspond to False Positives. Setting η to 10 was a suitable trade-off between validating the majority of valid samples as True Positives and rejecting False Positive samples. Next, the overall Accuracy, Precision, True Positive Rate (TPR), and False Positive Rate (FPR) were computed.

Table II summarizes the detection results of each obstacle at different distances. The method reported an overall accuracy of 91.7%. The distance directly affected the detection performance, especially with small objects at distances >4 m. At such distances, small objects triggered few events, hampering object detection, and the lack of triggered events affected the obstacle detection, as many events in S(x) did not satisfy the τ condition. In general, at distances between 2 and 4 m the method reported an accuracy above 94.7%. In this range, the size of the different objects in the image was enough to perform successful dynamic object detection in the majority of the experiments. Additionally, the method provided a low number of False Positives, as reported by the average FPR of 4.2% and an average Precision of 95.1%. Finally, a few detections were missed during the experiments, as evidenced by the average TPR of 88.5%.

The obstacle detection method was also evaluated in multi-obstacle experiments in which three objects were thrown from different directions in the camera FoV at similar times. It reported an accuracy of 74.2% among all the experiments. The performance reduction was caused by the generation of False
Positive samples when the obstacles overlapped in the image plane, which tended to merge clusters, hampering the individual detection of objects. Despite this degradation, the results of the method were satisfactory taking into account that the detection was performed with a single event camera.

TABLE III. Success rate of the proposed dynamic obstacle avoidance method in different scenarios.

Fig. 8. Box plot (left) and mean and standard deviation (right) of the motion direction error along each of 30 experiments. The direction error is defined as the instantaneous absolute difference between the direction of motion estimated by our approach and the obstacle direction from the ground truth. The mean error considering all the experiments is 6.97°, the mean standard deviation considering all the experiments is 3.89°, and the maximum instantaneous direction error is 19.54°.
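To make the direction-error metric of Fig. 8 concrete, the snippet below shows one way it could be computed. It is a minimal sketch under our own assumptions (estimated flow and ground-truth motion direction given as 2D vectors; function names are hypothetical), not the authors' evaluation code.

```python
import numpy as np

def direction_error_deg(flow_xy, gt_dir_xy):
    """Instantaneous absolute angle difference (degrees) between the estimated
    mean flow direction and the ground-truth motion direction."""
    ang_flow = np.arctan2(flow_xy[1], flow_xy[0])
    ang_gt = np.arctan2(gt_dir_xy[1], gt_dir_xy[0])
    diff = np.abs(ang_flow - ang_gt)
    diff = min(diff, 2.0 * np.pi - diff)   # wrap the difference to [0, pi]
    return float(np.degrees(diff))

def summarize_errors(errors_deg):
    """Mean, standard deviation, and RMSE of the error along a trajectory."""
    e = np.asarray(errors_deg, dtype=float)
    return {"mean": e.mean(), "std": e.std(), "rmse": float(np.sqrt((e ** 2).mean()))}
```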
The motion estimation evaluation consisted of comparing the mean optical flow of the dynamic obstacle v̄ with the ground truth measurements. The ground truth was obtained from the motion capture system by projecting the position of the obstacle in the image and estimating its motion direction from previous and future samples. For better validation, the obstacles were launched from a distance of 8 m to describe larger trajectories. A total of 30 experiments were performed. The error was defined as the instantaneous angle difference between the estimated direction of the obstacle movement and the ground truth direction. Fig. 8 shows the quartiles, mean, and standard deviation of the error along the object trajectory in each experiment. The resulting Root Mean Square Error was 11.2°, which is a reasonable error to guide the robot in a collision-free direction. The Safety Distance b in Section IV-B was used to account for this error by enlarging the obstacle geometry, adding extra safety to the evasion maneuver.

B. Sense-and-Avoid Evaluation

Next, the full proposed sense-and-avoid system was evaluated. Two cases can be distinguished in case the obstacle detection method fails. A False Positive obstacle detection triggers an unnecessary evasion maneuver. A False Negative obstacle detection neglects the evaluation of a collision risk situation, potentially causing an impact between the robot and the obstacle. Different sets of experiments were analysed to evaluate the system performance. The ornithopter was launched towards a goal zone by an operator and performed forward flight using the controller described in [5]. Obstacles were launched in various directions to intercept the ornithopter and evaluate different evasive maneuvers. Lightweight obstacles were thrown using a motorized launcher, which sets their initial speed, while larger objects were manually launched. Obstacles were thrown from a distance of 10 m and reached an average speed in the range of [5, 8] m/s, producing relative speeds up to 11 m/s, see Table I. Our system checked for collision risk situations and activated an avoidance maneuver whenever one was detected. Three types of analyses were performed.

First, we performed experiments to analyse the performance of the system in case an Intersection of the Safety Volumes (ISV) occurred. An ISV exists when the robot safety volume and the sphere of radius R̄ enclosing the object intersect along the robot trajectory in case it performs no evasion maneuver. These robot trajectories were estimated assuming the robot maintains a forward motion. The experiments were performed indoors: the ground truth robot and obstacle positions were recorded using a motion capture system to check if an ISV occurred. A total of 25 experiments were performed with each object using regular (760 lx) and dark (<15 lx) illumination conditions. The average avoidance success rate with the three objects, see Table III, was 90.7%. The best result was obtained using the FitBall as obstacle, whose larger size allowed an earlier detection of the collision risk situation. The experiments with dark illumination conditions also reported a remarkable success rate: the slight performance degradation was caused by the additional noisy events produced by the poor lighting.

Second, we performed experiments to analyze the system performance in case ISV situations did not occur. 20 experiments were performed using the Small Box obstacle. In 85% of the experiments, the system did not detect collision risk situations. Only in 15% of the cases did it detect collision risk situations, activating an unnecessary evasive maneuver, in flights with no ISV situations. This result was mainly caused by the conservative selection of the radius R̄, which enlarged the obstacle size to reduce impacts with the robot body. False Positive detections increased in the dark lighting experiments due to the higher level of noisy events.

Finally, the system performance in outdoor experiments was analyzed to evaluate its robustness to different scenarios. In these experiments, the collisions were evaluated visually due to the lack of motion capture information. A total of 25 experiments were performed with each object under light and dark lighting conditions. The results are shown in Table III. The proposed system had a success rate of 92.0% with regular illumination conditions, and reported acceptable results (85.3%) with dark lighting conditions.

VI. CONCLUSION AND FUTURE WORK

This letter presents the first event-based obstacle avoidance system for large-scale flapping-wing robots. The proposed approach exploits the advantages of event-based vision to detect
dynamic obstacles and perform evasion maneuvers. Our scheme has been validated in several indoor and outdoor scenarios with different illumination conditions. It reports an average avoidance success rate of 89.7% evading dynamic obstacles of different sizes and shapes. Further, its event-by-event processing nature and efficient implementation allow fast onboard computation even on hardware with low processing capacity, providing high-rate estimations (250 Hz in our experiments). Future work includes the validation of our method in other agile robots. Further, the development of a map-based method for the avoidance of static obstacles, and its integration in a complete obstacle avoidance system for static and dynamic objects, are the subject of future research.

ACKNOWLEDGMENT

The authors would like to thank Jesus Tormo for his help with the robot electronics, and Angela Romero and Francisca Lobera Corsetti for their support on the validation experiments.

REFERENCES

[1] R. Zufferey et al., “Design of the high-payload flapping wing robot E-flap,” IEEE Robot. Automat. Lett., vol. 6, no. 2, pp. 3097–3104, Apr. 2021.
[2] G. de Croon, “Flapping wing drones show off their skills,” Sci. Robot., vol. 5, pp. 1–2, 2020.
[3] N. Haider, A. Shahzad, M. N. M. Qadri, and S. I. Ali Shah, “Recent progress in flapping wings for micro aerial vehicle applications,” Proc. Inst. Mech. Eng., Part C: J. Mech. Eng. Sci., vol. 2, pp. 245–264, 2020.
[4] A. G. Eguíluz, J. P. Rodríguez-Gómez, J. Paneque, P. Grau, J. R. Martínez-de Dios, and A. Ollero, “Towards flapping wing robot visual perception: Opportunities and challenges,” in Proc. IEEE Int. Workshop Res., Educ. Develop. UAS, 2019, pp. 335–343.
[5] F. Maldonado, J. Acosta, J. Tormo-Barbero, P. Grau, M. Guzmán, and A. Ollero, “Adaptive nonlinear control for perching of a bioinspired ornithopter,” in Proc. IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2020, pp. 1385–1390.
[6] E. F. Helbling and R. J. Wood, “A review of propulsion, power, and control architectures for insect-scale flapping-wing vehicles,” Appl. Mech. Rev., vol. 70, no. 1, pp. 1–9, 2018.
[7] G. Gallego et al., “Event-based vision: A survey,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 44, no. 1, pp. 154–180, Jan. 2022.
[8] E. Mueggler, N. Baumli, F. Fontana, and D. Scaramuzza, “Towards evasive maneuvers with quadrotors using dynamic vision sensors,” in Proc. Eur. Conf. Mobile Robots, 2015, pp. 1–8.
[9] N. J. Sanket et al., “EVDodgeNet: Deep dynamic obstacle dodging with event cameras,” in Proc. IEEE Int. Conf. Robot. Automat., 2020, pp. 10651–10657.
[10] D. Falanga, K. Kleber, and D. Scaramuzza, “Dynamic obstacle avoidance for quadrotors with event cameras,” Sci. Robot., vol. 5, no. 40, pp. 1–14, 2020.
[11] S. Hrabar, “Reactive obstacle avoidance for rotorcraft UAVs,” in Proc. IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2011, pp. 4967–4974.
[12] H. Oleynikova, D. Honegger, and M. Pollefeys, “Reactive avoidance using embedded stereo vision for MAV flight,” in Proc. IEEE Int. Conf. Robot. Automat., 2015, pp. 50–56.
[13] D. Falanga, S. Kim, and D. Scaramuzza, “How fast is too fast? The role of perception latency in high-speed sense and avoid,” IEEE Robot. Automat. Lett., vol. 4, no. 2, pp. 1884–1891, Apr. 2019.
[14] Z. Ma, C. Wang, Y. Niu, X. Wang, and L. Shen, “A saliency-based reinforcement learning approach for a UAV to avoid flying obstacles,” Robot. Auton. Syst., vol. 100, pp. 108–118, 2018.
[15] F. G. Bermudez and R. Fearing, “Optical flow on a flapping wing robot,” in Proc. IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2009, pp. 5027–5032.
[16] G. de Croon, E. de Weerdt, C. de Wagter, and B. Remes, “The appearance variation cue for obstacle avoidance,” in Proc. IEEE Int. Conf. Robot. Biomimetics, 2010, pp. 1606–1611.
[17] S. Tijmons, G. C. H. E. de Croon, B. D. W. Remes, C. De Wagter, and M. Mulder, “Obstacle avoidance strategy using onboard stereo vision on a flapping wing MAV,” IEEE Trans. Robot., vol. 33, no. 4, pp. 858–874, Aug. 2017.
[18] B. J. P. Hordijk, K. Y. Scheper, and G. C. De Croon, “Vertical landing for micro air vehicles using event-based optical flow,” J. Field Robot., vol. 35, no. 1, pp. 69–90, 2018.
[19] A. G. Eguíluz, J. P. Rodríguez-Gómez, J. R. Martínez-de Dios, and A. Ollero, “Asynchronous event-based line tracking for time-to-contact maneuvers in UAS,” in Proc. IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2020, pp. 5978–5985.
[20] J. Delmerico, T. Cieslewski, H. Rebecq, M. Faessler, and D. Scaramuzza, “Are we ready for autonomous drone racing? The UZH-FPV drone racing dataset,” in Proc. IEEE Int. Conf. Robot. Automat., 2019, pp. 6713–6719.
[21] S. Sun, G. Cioffi, C. de Visser, and D. Scaramuzza, “Autonomous quadrotor flight despite rotor failure with onboard vision sensors: Frames vs. events,” IEEE Robot. Automat. Lett., vol. 6, no. 2, pp. 580–587, Apr. 2021.
[22] J. P. Rodríguez-Gómez, A. G. Eguíluz, J. R. Martínez-de Dios, and A. Ollero, “Auto-tuned event-based perception scheme for intrusion monitoring with UAS,” IEEE Access, vol. 9, pp. 44840–44854, 2021.
[23] J. Rodríguez-Gómez et al., “The GRIFFIN perception dataset: Bridging the gap between flapping-wing flight and robotic perception,” IEEE Robot. Automat. Lett., vol. 6, no. 2, pp. 1066–1073, Apr. 2021.
[24] A. G. Eguíluz et al., “Why fly blind? Event-based visual guidance for ornithopter robot flight,” in Proc. IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2021, pp. 1935–1942.
[25] Z. Dengsheng and L. Guojun, “Segmentation of moving objects in image sequence: A review,” Circuits, Syst., Signal Process., vol. 20, no. 2, pp. 143–183, 2001.
[26] R. Tapia, A. G. Eguíluz, J. Martínez-de Dios, and A. Ollero, “ASAP: Adaptive scheme for asynchronous processing of event-based vision algorithms,” in Proc. IEEE Int. Conf. Robot. Automat. Workshop Unconventional Sensors Robot., 2020.
[27] A. Mitrokhin, C. Fermüller, C. Parameshwara, and Y. Aloimonos, “Event-based moving object detection and tracking,” in Proc. IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2018, pp. 6895–6902.
[28] M. Liu and T. Delbruck, “Adaptive time-slice block-matching optical flow algorithm for dynamic vision sensors,” in Proc. Brit. Mach. Vis. Conf., Sep. 2018, pp. 1–12.
[29] E. Mueggler, C. Bartolozzi, and D. Scaramuzza, “Fast event-based corner detection,” in Proc. Brit. Mach. Vis. Conf., 2017, pp. 1–11.
[30] J. P. Rodríguez-Gómez, A. G. Eguíluz, J. R. Martínez-de Dios, and A. Ollero, “Asynchronous event-based clustering and tracking for intrusion monitoring in UAS,” in Proc. IEEE Int. Conf. Robot. Automat., 2020, pp. 8518–8524.
[31] S. Armanini, C. de Visser, G. de Croon, and M. Mulder, “Time-varying model identification of flapping-wing vehicle dynamics using flight data,” J. Guid. Control Dyn., vol. 39, pp. 526–541, Mar. 2016.
[32] M. Guzmán et al., “Design and comparison of tails for bird-scale flapping-wing robots,” in Proc. IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2021, pp. 6335–6342.