Fully onboard low-power localization with semantic sensor fusion on a nano-UAV using floor plans

N Zimmerman, H Müller, M Magno… - 2024 IEEE International Conference on Robotics and Automation (ICRA), 2024 - ieeexplore.ieee.org
Nano-sized unmanned aerial vehicles (UAVs) are well suited to indoor applications and to operating in close proximity to humans. To enable autonomy, a nano-UAV must be able to self-localize in its operating environment, a particularly challenging task given the limited sensing and compute resources on board. This work presents an online, onboard approach for localization in floor plans annotated with semantic information. Unlike sensor-based maps, floor plans are readily available and do not increase the cost and time of deployment. To overcome the difficulty of localizing in such sparse maps, the proposed approach fuses geometric information from miniaturized time-of-flight sensors with semantic cues. The semantic information is extracted from images by deploying a state-of-the-art object detection model on a high-performance multi-core microcontroller onboard the drone, consuming only 2.5 mJ per frame and executing in 38 ms. In our evaluation, we globally localize in a real-world office environment with a 90% success rate. We also release an open-source implementation of our work.
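
The abstract does not detail the estimator behind the fusion, so the following Python sketch is an illustration only: it assumes a particle-filter (Monte Carlo) formulation over an annotated occupancy-grid floor plan, where each particle's weight combines a Gaussian likelihood of the measured time-of-flight ranges against ray-cast expected ranges with a simple score for how well detected object classes match the floor plan's semantic annotations. The grid, the raycast and semantic_likelihood helpers, and all numeric parameters are invented for this sketch and are not taken from the paper; the authors' released implementation is the authoritative reference.

```python
"""Minimal sketch of fusing ToF ranges and semantic detections against an
annotated floor plan with a particle filter. The filter type, grid, helper
functions, and all numbers here are assumptions for illustration only."""
import numpy as np

rng = np.random.default_rng(42)

# Toy floor plan: 1 = wall, 0 = free; 0.1 m per cell (assumed resolution).
PLAN = np.zeros((40, 60), dtype=np.uint8)
PLAN[0, :] = PLAN[-1, :] = PLAN[:, 0] = PLAN[:, -1] = 1
PLAN[:20, 30] = 1                      # an interior wall
RES = 0.1
# Semantic annotations: (row, col) cell -> class label.
SEMANTICS = {(25, 30): "door", (5, 10): "extinguisher"}

def raycast(pose, rel_angle, max_range=4.0):
    """Expected range from pose = (x, y, yaw) along yaw + rel_angle,
    found by stepping through the occupancy grid."""
    x, y, yaw = pose
    a = yaw + rel_angle
    for r in np.arange(0.0, max_range, RES / 2):
        col = int((x + r * np.cos(a)) / RES)
        row = int((y + r * np.sin(a)) / RES)
        if not (0 <= row < PLAN.shape[0] and 0 <= col < PLAN.shape[1]) or PLAN[row, col]:
            return r
    return max_range

def semantic_likelihood(pose, detected, fov=1.0, max_range=3.0):
    """Reward poses whose visible annotations match the detector's classes
    (crude visibility test: within range and field of view)."""
    x, y, yaw = pose
    visible = set()
    for (row, col), label in SEMANTICS.items():
        dx, dy = col * RES - x, row * RES - y
        bearing = np.angle(np.exp(1j * (np.arctan2(dy, dx) - yaw)))  # wrap to (-pi, pi]
        if np.hypot(dx, dy) < max_range and abs(bearing) < fov / 2:
            visible.add(label)
    return 0.2 + len(visible & set(detected))  # small floor avoids zero weights

def update(particles, tof_ranges, tof_angles, detected, sigma_r=0.1):
    """One measurement update: geometric ToF likelihood x semantic likelihood,
    followed by resampling and a little roughening noise."""
    w = np.ones(len(particles))
    for i, p in enumerate(particles):
        for r_meas, ang in zip(tof_ranges, tof_angles):
            r_exp = raycast(p, ang)
            w[i] *= max(np.exp(-0.5 * ((r_meas - r_exp) / sigma_r) ** 2), 1e-9)
        w[i] *= semantic_likelihood(p, detected)
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx] + rng.normal(0.0, [0.02, 0.02, 0.01], particles.shape)

# Usage: spread particles over the free space (global localization), then
# update with four ToF rays (front/left/back/right) and the detected classes.
particles = np.column_stack([rng.uniform(0.2, 5.8, 2000),
                             rng.uniform(0.2, 3.8, 2000),
                             rng.uniform(-np.pi, np.pi, 2000)])
particles = update(particles,
                   tof_ranges=[1.2, 0.8, 2.5, 0.6],
                   tof_angles=[0.0, np.pi / 2, np.pi, -np.pi / 2],
                   detected=["door"])
```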