Scirobotics - Ado6187 SM
Methods
Figs. S1 to S13
Table S1
Legends for movies S1 to S4
References (66–75)
Movies S1 to S4
Supplementary Methods
System breakdown of SUPER
This section provides a detailed breakdown of the SUPER system with respect to weights, power
consumption, and CPU usage (Fig. S1). As illustrated in Fig. S1A, the total hardware weight
of SUPER is 1,512 g, comprising the airframe (510 g), battery (432 g), onboard computer (266
g), LiDAR sensor (255 g), and other avionics components (52 g). The power consumption
of SUPER was measured during a typical flight mission in a park (see Fig. S2A), with sensing, planning, and control running fully onboard and an average speed of 2.84 m/s. During the flight mission, SUPER is commanded to continuously traverse the waypoint sequence p1 → p2 → p3 → p4 → p1 → p5 → p3 → p4 → p1 → p6 → p3 → p4, as shown in Fig. S2B. With a 3300 mAh battery, the drone can perform autonomous navigation through the given sequence of waypoints until the battery is depleted. The total measured flight time is approximately 11 minutes and 24 seconds, with a total flight distance of 1,942.56 m (see Fig. S2). The total average power
consumption is around 432 W, with 87.9% (379.7 W) consumed by the actuators. The onboard
computer is the second-largest power consumer, drawing around 43.3 W. The LiDAR sensor
and other avionics components consume 6.3 W and 3.2 W, respectively.
The CPU usage of the onboard computer was also measured during the flight mission. The
average CPU usage is around 13.9%, with a peak of 19.7%. The state estimation module is
the primary consumer, taking up 11.1% on average. The three main software components (state estimation, planning, and control) run in parallel as separate processes and communicate
with each other through the publisher-subscriber mechanism of the Robot Operating System
(ROS) (66). To further enhance computational efficiency, the processes within each module are
parallelized using multiple threads. In the state estimation module, the nearest neighbor search
during scan registration is parallelized across 16 threads. In the planning module, the updates to
the spatio-temporal sliding map and the trajectory planning tasks, which include both frontend
path search and backend trajectory optimization, are handled by two separate threads. Lastly,
in the control module, the model predictive controller (MPC) operates on a single thread.
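As an illustration of the thread-level parallelism described above, the sketch below is our own minimal example (not the authors' implementation): per-point nearest-neighbor queries in scan registration are independent, so they can be spread across 16 threads with OpenMP. A brute-force search over the map points stands in here for the k-d tree structure used in practice; all names are hypothetical. Compile with -fopenmp.

#include <Eigen/Dense>
#include <limits>
#include <vector>

// For each scan point, find the index of its nearest map point.
// The queries are independent, so the loop parallelizes trivially.
std::vector<int> FindNearestIndices(const std::vector<Eigen::Vector3d>& scan,
                                    const std::vector<Eigen::Vector3d>& map_points) {
  std::vector<int> nearest(scan.size(), -1);
#pragma omp parallel for num_threads(16)  // one chunk of scan points per thread
  for (int i = 0; i < static_cast<int>(scan.size()); ++i) {
    double best = std::numeric_limits<double>::max();
    for (int j = 0; j < static_cast<int>(map_points.size()); ++j) {
      const double d2 = (scan[i] - map_points[j]).squaredNorm();
      if (d2 < best) { best = d2; nearest[i] = j; }  // keep closest candidate
    }
  }
  return nearest;
}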
where α ≥ 1 is the expansion factor. We use the same definition to describe a sphere, where
C = diag(r, r, r) and d is the sphere’s center.
The input of the proposed method consists of the obstacle point cloud O ∈ R^{3×N} with N points, the ellipsoid in the world frame E^W, the seed S, and the robot's radius r. The process is
outlined in Algorithm 2 and visualized in Fig. S7C.
At first, the active obstacle point cloud O_a is initialized with the full obstacle point cloud O (line 2). We then keep generating hyperplanes to separate O_a until it becomes empty. In each iteration, we choose the obstacle sphere nearest to the center of the ellipsoid E, denoted E^W_s (line 4), and try to find the separating hyperplanes between E^W_s and the ellipsoid. The key to finding these hyperplanes is locating the planes that intersect the obstacles and are tangent to the uniformly expanded ellipsoid E^W_α. Building upon the concept presented in (59), we simplify the problem by transforming it into the ellipsoid frame E and casting it as a single least-distance programming problem, thus avoiding a search over different expansion factors α. In Eq. S1, we define the ellipsoid in the world frame W as the image of the unit ball in the ellipsoid frame E. Thus, we construct the inverse map and transform the problem into the ellipsoid frame E, as shown in Fig. S7B. The image p^E_t of the intersecting point p^W_t is the point on E^E_s closest to the origin of the E frame. The problem thus becomes a minimum-distance optimization problem:
$$
p^E_t = \arg\min_{x}\ \|x\|_2, \quad \text{s.t.}\ x \in \mathcal{E}^E_s.
\tag{S3}
$$
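For reference, the frame change used here follows directly from Eq. S1: assuming the ellipsoid is the image of the unit ball under x ↦ Cx + d (with C the shape matrix and d the center of the ellipsoid, as used in Eq. S4), a world-frame point and the obstacle sphere are mapped into the ellipsoid frame by the inverse affine map

$$
x^E = C^{-1}\left(x^W - d\right), \qquad
\mathcal{E}^E_s = \left\{\, C^{-1}(x - d) \;:\; x \in \mathcal{E}^W_s \,\right\},
$$

under which the ellipsoid E^W becomes the unit ball and the obstacle sphere E^W_s becomes the ellipsoid E^E_s.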
Algorithm 2: Ellipsoid-based Convex Decomposition
Params: Active obstacle point cloud O_a
Input: Obstacle point cloud O; robot's radius r; seed S; ellipsoid E^W
Output: Polyhedron P
1  Function GeneratePolytope(O, r, S, E):
2      O_a = O;
3      while not O_a.empty() do
4          E^W_s = FindClosestObstacle(E^W, O_a);
5          H = ComputeSeparatingPlanes(E^W_s);
6          foreach s ∈ S do
7              H = CheckAndAdjustPlane(H, s, E^W_s);
8          end
9          O_a = O_a \ (H ∩ O_a);
10         P.AddPlane(H);
11     end
12     return P;
13 End Function
This problem can be solved efficiently by finding the root of a sixth-degree polynomial (71). We implement it in C++, and on an Intel i7 CPU the typical solving time is less than 0.5 µs. After finding the tangent point p^E_t, we obtain the half-space H^E(A^E, b^E) = (p^E_t / ‖p^E_t‖, −‖p^E_t‖) and the expansion factor α = ‖p^E_t‖. We then transform them back to obtain H^W in the world frame:
$$
\mathcal{H}^W\!\left(C A^E,\; b^E + (C A^E) \cdot d\right),
\tag{S4}
$$
where C and d are the shape matrix and the center of E, respectively.
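To illustrate the least-distance step in Eq. S3, the sketch below is our own minimal example, not the closed-form sixth-degree-polynomial solver of (71) used by the authors: the point of an axis-aligned ellipsoid closest to the origin is found by Newton iteration on the standard secular equation. The function name and the axis-aligned assumption are ours; a rotated ellipsoid would first be expressed in its principal axes.

#include <Eigen/Dense>
#include <cmath>

// Closest point to the origin on the boundary of the axis-aligned ellipsoid
// {c + diag(a) u : ||u|| <= 1}, assuming the origin lies outside it.
// KKT stationarity gives x_i = t c_i / (a_i^2 + t); the multiplier t solves the
// secular equation F(t) = sum_i (a_i c_i)^2 / (a_i^2 + t)^2 - 1 = 0, found here
// by Newton iteration from t = 0 (monotone since F is convex and decreasing for t >= 0).
Eigen::Vector3d ClosestPointToOrigin(const Eigen::Vector3d& c,    // ellipsoid center
                                     const Eigen::Vector3d& a) {  // semi-axis lengths
  double t = 0.0;
  for (int iter = 0; iter < 50; ++iter) {
    double F = -1.0, dF = 0.0;
    for (int i = 0; i < 3; ++i) {
      const double s = a[i] * a[i] + t;
      const double r = a[i] * c[i] / s;
      F += r * r;              // F(t)
      dF -= 2.0 * r * r / s;   // F'(t)
    }
    if (std::abs(F) < 1e-12) break;
    t -= F / dF;               // Newton step
  }
  Eigen::Vector3d x;
  for (int i = 0; i < 3; ++i) x[i] = t * c[i] / (a[i] * a[i] + t);
  return x;
}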
The process above finds a separating hyperplane between the ellipsoid and the obstacle
spheres. However, it is possible that the seed does not lie in the half-space defined by the
generated hyperplanes. To address this issue, we propose a plane adjustment method to ensure
that the generated polyhedron includes the seed (line 6). As depicted in Fig. S7D, we first verify that both s_a and s_b lie within the half-space defined by H. Without loss of generality, if the seed point s_a is located outside H, we adjust H to a new hyperplane H_adj that passes through s_a, is tangent to the obstacle sphere, and points towards the other seed point s_b. This adjustment ensures that H_adj contains both s_a and s_b, and H_adj is then taken as the hyperplane generated in this round. After the plane adjustment (if necessary), all obstacle points outside H are excluded from O_a, and H is added to the polyhedron P. This process repeats until all obstacle points are excluded.
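A minimal sketch of the point-exclusion step on line 9 of Algorithm 2 follows; it is our own illustration with hypothetical names, assuming a half-space is stored as a unit normal n and offset b with n · x ≤ b meaning the point is on the seed side (the sign convention depends on how H is actually stored).

#include <Eigen/Dense>
#include <vector>

// Half-space {x : n.dot(x) <= b}; points violating it are already separated
// from the corridor by this plane.
struct HalfSpace {
  Eigen::Vector3d n;  // unit normal
  double b;           // offset
  bool contains(const Eigen::Vector3d& x) const { return n.dot(x) <= b; }
};

// Line 9 of Algorithm 2: keep only the active points that the new hyperplane
// has not yet separated; the rest need no further planes.
void ExcludeSeparatedPoints(std::vector<Eigen::Vector3d>& active_cloud,
                            const HalfSpace& H) {
  std::vector<Eigen::Vector3d> remaining;
  remaining.reserve(active_cloud.size());
  for (const auto& p : active_cloud) {
    if (H.contains(p)) remaining.push_back(p);  // still on the seed side
  }
  active_cloud.swap(remaining);
}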
Maximum Volume Inscribed Ellipsoid
Given the polyhedron (a set of hyperplanes), this step involves updating the ellipsoid by deter-
mining the maximum volume inscribed ellipsoid (MVIE) within the given polyhedron. Assume
the polyhedron consists of K hyperplanes P = {H_1, ..., H_K} and the ellipsoid is defined by Eq. S1; the optimization problem can then be formulated as
$$
\max_{C,\,d}\ \det(C), \quad \text{s.t.}\ \{Cx + d : \|x\| \le 1\} \subseteq \mathcal{H}_k,\quad k = 1,\dots,K.
$$
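Writing each hyperplane as H_k = {x : a_k^T x ≤ b_k} (a notation introduced here only for illustration), the containment condition above takes the standard convex form used for maximum volume inscribed ellipsoid problems,

$$
\left\|C^{\top} a_k\right\|_2 + a_k^{\top} d \le b_k, \quad k = 1,\dots,K,
$$

so the MVIE can be obtained by maximizing log det(C) over C ≻ 0 and d subject to these second-order-cone constraints.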
(c) p_c ∈ S.
Proof. Assume that S is not entirely contained within the known-free space. This implies the
existence of a point p with an occupancy state that is either occupied or unknown. In the first
case, if p is in the occupied space, it must be captured in D, which contradicts condition (b). In
the second case, if p is in the unknown space, its depth must be greater than that of any point d ∈ D with the same bearing direction. Since such a point d lies on the line connecting p and p_c (the bearing direction), the convexity of S implies that d ∈ S. This once again contradicts condition (b).
Hence, in both cases, we arrive at a contradiction to condition (b), leading to the conclusion that
S must be entirely contained within the known-free space.
where m is the mass of the drone, g represents the gravity vector with a magnitude of 9.81 m/s², and a_T denotes the scalar thrust acceleration. The position and velocity in the inertial frame are denoted by p ∈ R³ and v ∈ R³, respectively. The attitude of the airframe is represented by R ∈ SO(3), and ω denotes the angular velocity in the body frame. The operation ⌊·⌋ converts a vector into a skew-symmetric matrix. D denotes the rotor drag coefficient matrix, which is parameterized as D = diag(d_h, d_h, d_v).
As discussed in (64), the above quadrotor dynamic system with rotor drag is differentially
flat. The flat output vector is typically selected as σ = [p, ψ]^T, where ψ represents the yaw
angle. This differential flatness property allows us to plan the trajectory of the flat output σ(t)
instead of the entire state trajectory s(t) of the quadrotor UAV.
System linearization
As described in our previous work (65), OMMPC models the quadrotor system on a compound manifold and defines the system states and inputs as follows:
$$
\mathcal{M} = \mathbb{R}^3 \times \mathbb{R}^3 \times SO(3), \quad \dim(\mathcal{M}) = 9,
$$
$$
x = \left(p^I,\, v^I,\, R\right) \in \mathcal{M}, \qquad u = \left(a_T,\, \omega^B\right) \in \mathbb{R}^4,
\tag{S7}
$$
and the error between measured state x and reference state xd , lying on the manifold M, is
mapped into the local homeomorphic space (an open set in Euclidean space) around each point
xd using local coordinates. Particularly, the state error is defined as follows:
$$
\begin{aligned}
\delta x &\triangleq x_d \boxminus x = \left[\delta p^{\top}\ \ \delta v^{\top}\ \ \delta R^{\top}\right]^{\top} \in \mathbb{R}^9,\\
\delta p &\triangleq p_d \boxminus p = p_d - p \in \mathbb{R}^3,\\
\delta v &\triangleq v_d \boxminus v = v_d - v \in \mathbb{R}^3,\\
\delta R &\triangleq R_d \boxminus R = \mathrm{Log}\!\left(R^{\top} R_d\right) \in \mathbb{R}^3,
\end{aligned}
\tag{S8}
$$
and the original quadrotor system can be equivalently expressed by an error system in which the vehicle states are mapped to the local coordinates at each point along the reference trajectory. Following the derivation in (65), the resulting linearized error system is given by
$$
\delta x_{k+1} = F_{x_k}\,\delta x_k + F_{u_k}\,\delta u_k,
$$
where
$$
F_{x_k} =
\begin{bmatrix}
I_3 & I_3\,\Delta t & 0\\[2pt]
0 & I_3 - \dfrac{\Delta t}{m} R^{d}_k D R^{d\top}_k & \dfrac{\Delta t}{m} R^{d}_k \left\lfloor (2 I_3 - D)\, a_T m\, e_3 + D R^{d\top}_k v - R^{d\top}_k v \right\rfloor\\[2pt]
0 & 0 & \mathrm{Exp}\!\left(-\omega^{d}_k \Delta t\right)
\end{bmatrix},
\qquad
F_{u_k} = \Delta t
\begin{bmatrix}
0 & 0\\
-R^{d}_k e_3 & 0\\
0 & A\!\left(\omega^{d}_k \Delta t\right)^{\top}
\end{bmatrix},
\tag{S11}
$$
where
$$
A(\theta) = I_3 + \frac{1 - \cos\|\theta\|}{\|\theta\|}\,\frac{\lfloor\theta\rfloor}{\|\theta\|} + \left(1 - \frac{\sin\|\theta\|}{\|\theta\|}\right)\frac{\lfloor\theta\rfloor^{2}}{\|\theta\|^{2}}.
\tag{S12}
$$
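As a concrete reading of Eq. S12, the following is a minimal Eigen-based sketch of our own (not the authors' code) that evaluates A(θ); the small-angle fallback is an implementation detail we add here, not something stated in the text.

#include <Eigen/Dense>
#include <cmath>

// Skew-symmetric matrix of a 3-vector, i.e., the operator ⌊·⌋ of Eqs. S11/S12.
Eigen::Matrix3d Skew(const Eigen::Vector3d& v) {
  Eigen::Matrix3d S;
  S <<     0.0, -v.z(),  v.y(),
        v.z(),    0.0, -v.x(),
       -v.y(),  v.x(),    0.0;
  return S;
}

// A(theta) from Eq. S12; falls back to identity for very small angles.
Eigen::Matrix3d AMatrix(const Eigen::Vector3d& theta) {
  const double n = theta.norm();
  if (n < 1e-8) return Eigen::Matrix3d::Identity();  // small-angle guard (our assumption)
  const Eigen::Matrix3d S = Skew(theta);
  return Eigen::Matrix3d::Identity()
       + ((1.0 - std::cos(n)) / (n * n)) * S
       + ((1.0 - std::sin(n) / n) / (n * n)) * S * S;
}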
Therefore, the OMMPC problem can be expressed in terms of the current error and the reference trajectory, leading to an efficient QP formulation:
$$
\begin{aligned}
\min_{\delta u_k}\ & \sum_{k=0}^{N-1}\left(\|\delta x_k\|^2_Q + \|\delta u_k\|^2_R\right) + \|\delta x_N\|^2_P,\\
\text{s.t.}\ & \delta x_{k+1} = F_{x_k}\,\delta x_k + F_{u_k}\,\delta u_k,\\
& \delta u_k \in \delta\mathcal{U}_k,\quad k = 0,\dots,N-1,
\end{aligned}
\tag{S13}
$$
where N is the prediction horizon, and the positive-definite diagonal matrices Q, R, and P penalize the stage state, stage input, and terminal state, respectively. δU_k = {δu ∈ R^4 | u_min − u^d_k ≤ δu ≤ u_max − u^d_k} denotes the constraint set for the input error.
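For illustration only (a sketch under our own naming, not the authors' solver), the sparse problem in Eq. S13 can be condensed into a dense QP in the stacked input errors by propagating the linearized dynamics; the resulting Hessian H and gradient g, together with the box constraints δu_k ∈ δU_k, would then be handed to any off-the-shelf QP solver.

#include <Eigen/Dense>
#include <vector>

// Dense QP 0.5 * dU' H dU + g' dU obtained by eliminating the state errors.
struct DenseQP {
  Eigen::MatrixXd H;  // Hessian:  Gamma' Qbar Gamma + Rbar
  Eigen::VectorXd g;  // gradient: Gamma' Qbar Phi dx0
};

DenseQP CondenseOMMPC(const std::vector<Eigen::MatrixXd>& Fx,  // N matrices, 9x9 (Eq. S11)
                      const std::vector<Eigen::MatrixXd>& Fu,  // N matrices, 9x4 (Eq. S11)
                      const Eigen::MatrixXd& Q,                // 9x9 stage state weight
                      const Eigen::MatrixXd& R,                // 4x4 stage input weight
                      const Eigen::MatrixXd& P,                // 9x9 terminal weight
                      const Eigen::VectorXd& dx0) {            // 9x1 current state error
  const int N = static_cast<int>(Fx.size());
  Eigen::MatrixXd Phi = Eigen::MatrixXd::Zero(9 * N, 9);
  Eigen::MatrixXd Gamma = Eigen::MatrixXd::Zero(9 * N, 4 * N);
  Eigen::MatrixXd A = Eigen::MatrixXd::Identity(9, 9);
  for (int k = 0; k < N; ++k) {
    // dx_{k+1} = (Fx_k ... Fx_0) dx_0 + sum_j (Fx_k ... Fx_{j+1}) Fu_j du_j
    A = Fx[k] * A;
    Phi.block<9, 9>(9 * k, 0) = A;
    Gamma.block<9, 4>(9 * k, 4 * k) = Fu[k];
    for (int j = 0; j < k; ++j)
      Gamma.block<9, 4>(9 * k, 4 * j) = Fx[k] * Gamma.block<9, 4>(9 * (k - 1), 4 * j);
  }
  // Stack the weights on dx_1 ... dx_N; the dx_0 stage cost is a constant and is dropped.
  Eigen::MatrixXd Qbar = Eigen::MatrixXd::Zero(9 * N, 9 * N);
  Eigen::MatrixXd Rbar = Eigen::MatrixXd::Zero(4 * N, 4 * N);
  for (int k = 0; k < N; ++k) {
    Qbar.block<9, 9>(9 * k, 9 * k) = (k == N - 1) ? P : Q;
    Rbar.block<4, 4>(4 * k, 4 * k) = R;
  }
  DenseQP qp;
  qp.H = Gamma.transpose() * Qbar * Gamma + Rbar;
  qp.g = Gamma.transpose() * Qbar * Phi * dx0;
  return qp;
}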
Experiment Results
To quantitatively assess the effectiveness of the proposed OMMPC incorporating the rotor-drag
model, we conducted a comparative analysis with the common quadrotor model (65) using the
PX4 SITL software with the Gazebo simulator (75). The simulator provides realistic dynamics
and air resistance simulations. We generated an offline polynomial trajectory in the flat output space, denoted σ(t), with a maximum velocity of 15 m/s. The trajectory consisted of a straight line starting from p_0 = [0, 0, 0]^T and ending at p_g = [120, 0, 0]^T along the x-axis. Initially, we
generated the desired state trajectory using differential flatness under the common quadrotor
model and tracked the trajectory using the quadrotor OMMPC proposed in (65). As shown in
Fig. S13A, the UAV exhibited significant attitude tracking errors, and the position tracking error
was notably influenced by the velocity direction, reaching a maximum of 0.4 m. In contrast, by
employing the quadrotor model with rotor drag described in Eq. S6, the OMMPC
effectively predicted and compensated for the drag, resulting in improved attitude-tracking per-
formance and a reduced position-tracking error of 0.05 m.
Supplementary Figures and Tables
[Fig. S1 graphic. (A) Breakdown charts: weight (airframe 510 g, 33.7%; battery 432 g, 28.5%; LiDAR 255 g, 16.8%; other avionics 52 g, 3.4%), power (actuators 379.7 W, 87.9%; onboard computer 43.3 W, 10.0%; LiDAR 6.3 W, 1.5%; others 3.2 W, 0.7%), and CPU usage (free 86.0%; state estimation 11.1%; planning 2.1%; control 0.7%). (B) Hardware components: battery (Dualsky 3300 mAh), flight control unit (Nxt PX4), ESC (T-Motor F60A), solid-state LiDAR (Livox Mid-360), and 3D-printed carbon-fiber frame.]
Fig. S1. System breakdown of SUPER. (A) The weight, power, and CPU usage breakdown of SUPER. (B) The
hardware components of SUPER.
Fig. S2. Scenario and trajectory of the endurance test. (A) The testing scenario. (B) The point cloud view and
flying trajectory of the drone during the endurance test.
Fig. S3. Illustration of Falco with and without the gimbal structure. (A) The original version of Falco's open-sourced implementation lacked a gimbal structure. In this configuration, when the multirotor accelerates, the motion primitives are constrained to point downwards, preventing the drone from flying forward. (B) The modified
version of Falco with a gimbal structure. The LiDAR sensor maintains a horizontal orientation independent of the
drone’s attitude.
Fig. S4. Evaluation of success rate and efficiency. (A) The success rate of the benchmarked methods at different obstacle densities (measured as traversability) and flight speeds. Empty columns indicate that the corresponding
combination of speed and density was not achieved. (B) Flight results of the benchmarked methods in 1080
experiments. (C) Time consumption of the benchmarked methods, with squares representing the mean values and
error bars indicating the standard deviations of the total computation time. Each mean and standard deviation is
computed from 180 tests.
Fig. S5. Visual example of thin wire detection on OGM. (A) Third-person view of the MAV facing a thin wire
(shown in green). (B) Side view of the detection process. The gray box represents the cell in the occupancy grid
map (OGM) that contains the thin wire. The blue lines indicate the LiDAR beams that pass through the cell without
hitting the wire, while the red line indicates the beam that hits the wire. Since most of the beams do not detect the
wire, the cell is incorrectly classified as free space.
Fig. S6. Evaluation of the planning modules of Faster and SUPER. (A) (i) A top-down view of the simulation envi-
ronment, where a red cylinder is hidden behind the corner. (ii-iv) Comparison of the executed trajectories for the
three methods: Faster (23), the proposed SUPER, and SUPER without switching time optimization (referred to as
SUPER without OT). (B) Velocity profile and execution of backup trajectories in the simulation. (C) Switching
time ts of Faster and SUPER given the same exploratory trajectory and backup corridor. (D) (i) Real-world experi-
ment of SUPER, with the hidden obstacle highlighted by a red dashed box. (ii) First-person view from the onboard
camera at t2 . (E) Planned trajectories at time t1 and t2 . Te1 and Tb1 are the exploratory and backup trajectories at
t1 , respectively, and Te2 is the exploratory trajectory at t2 . The backup trajectory at t2 is not displayed.
(F) Velocity profile in the experiment.
Fig. S7. Illustration of the proposed flight corridor generation algorithm (CIRI). (A) Comparison of three
different methods for generating a safe flight corridor in configuration space. (B) Example illustrating the trans-
formation between the ellipsoid frame E and the world frame W. (C) Visual depiction of the process of convex
decomposition in the C-space. (D) Example demonstrating the plane adjustment strategy to ensure that the given
seeds lie within the generated polyhedron.
[Fig. S8 graphic. (A) Benchmark environments: Pillars (OFR 0.01–0.6), Forest (OFR 0.005–0.3), and Perlin (OFR 0.2–0.6); compared methods: CIRI, FIRI-Inflation, RILS-Inflation, FIRI-Shrinkage, and RILS-Shrinkage.]
Fig. S8. Comparison of convex decomposition in configuration space. (A) Example benchmark environments
with obstacle filling rates (OFR) ranging from 0.005 to 0.6. (B) Seed containment rate of generated polyhedra
by different methods across OFR. (C) Boxplot showing the average polyhedron volumes generated by different
methods across OFR, with each plot based on 100 tests. The central bar represents the median. (D) Boxplot of
computation times for the various methods across OFR, with each plot derived from 100 tests. The central bar
indicates the median.
Fig. S9. The plot of the barrier function Lµ. As the parameter µ decreases from µ = 0.5 to µ = 0.01, the barrier function becomes progressively harder, with a sharper shape.
Fig. S10. The plot of the mapping from η to ts . The plot illustrates a sigmoid-like function
that maps η ∈ R to the interval (tc , to ).
Fig. S11. Backup corridor generation with a limited FOV. (A) The largest convex subset of a non-convex FOV is
formed by the intersection of two half-spaces defined by the hyperplanes tangent to the FOV’s upper boundary
Hu and lower boundary Hl . (B) Side view of the hyperplanes Hu and Hl along with the vertical FOV angle,
θ. (C) Top-down view showing the convex subset and the two hyperplanes, which are uniquely determined by
the heading direction h. (D) The heading direction is aligned with the exploratory trajectory to ensure that the
generated backup corridor contains the initial portion of the exploratory trajectory.
Fig. S12. Benchmark comparison with different vertical FOVs. (A) Benchmarking results on simulation maps
with a traversability score of 3.1. (B) Benchmarking results on simulation maps with a traversability score of
6.5. For both (A) and (B), the distributions (ii-iii) illustrate the average flight speed and the ratio of executing the
backup trajectory. Error bars represent the standard deviation, while the black points indicate the mean values.
Each distribution is based on data from 180 tests.
[Fig. S13 graphic: panels for the common model and the drag-effect-aware model, each plotting speed [m/s], pitch [deg.], and position error [m] with measurement and reference traces.]
Fig. S13. Comparison of tracking performance with different drone models of MPC. The plot of speed, pitch
angle, and position tracking error in each axis.
Table S1. Implementation details of different methods
Movie S1. An overview of the proposed SUPER system. SUPER demonstrates its ability to safely navigate
through unknown, cluttered environments at high speeds, avoid thin obstacles like power lines, and perform
robustly in various scenarios, including object tracking and autonomous exploration.
2. B. Mishra, D. Garg, P. Narang, V. Mishra, Drone-surveillance for search and rescue in natural
disaster. Comput. Commun. 156, 1–10 (2020).
4. B. Rabta, C. Wankmüller, G. Reiner, A drone fleet model for last-mile distribution in disaster
relief operations. Int. J. Disaster Risk Reduct. 28, 107–112 (2018).
6. P. Foehn, A. Romero, D. Scaramuzza, Time-optimal planning for quadrotor waypoint flight. Sci.
Robot. 6, eabh1221 (2021).
8. A. Romero, S. Sun, P. Foehn, D. Scaramuzza, Model predictive contouring control for time-
optimal quadrotor flight. IEEE Trans. Robot. 38, 3340–3356 (2022).
10. J. Zhang, R. G. Chadha, V. Velivela, S. Singh, P-cal: Pre-computed alternative lanes for
aggressive aerial collision avoidance, paper presented at the 12th International Conference on
Field and Service Robotics (FSR), Tokyo, Japan, 31 August 2019.
11. J. Zhang, R. G. Chadha, V. Velivela, S. Singh, P-cap: Pre-computed alternative paths to enable
aggressive aerial maneuvers in cluttered environments, in 2018 IEEE/RSJ International
Conference on Intelligent Robots and Systems (IROS) (IEEE, 2018), pp. 8456–8463.
12. Y. Ren, S. Liang, F. Zhu, G. Lu, F. Zhang, Online whole-body motion planning for quadrotor
using multi-resolution search, in 2023 IEEE International Conference on Robotics and
Automation (ICRA) (IEEE, 2023), pp. 1594–1600.
13. Y. Ren, F. Zhu, W. Liu, Z. Wang, Y. Lin, F. Gao, F. Zhang, Bubble planner: Planning
high-speed smooth quadrotor trajectories using receding corridors, in 2022 IEEE/RSJ
International Conference on Intelligent Robots and Systems (IROS) (IEEE, 2022), pp. 6332–
6339.
14. X. Zhou, Z. Wang, H. Ye, C. Xu, F. Gao, Ego-planner: An esdf-free gradient-based local
planner for quadrotors. IEEE Robot. Autom. Lett. 6, 478–485 (2020).
15. H. Ye, X. Zhou, Z. Wang, C. Xu, J. Chu, F. Gao, Tgk-planner: An efficient topology guided
kinodynamic planner for autonomous quadrotors. IEEE Robot. Autom. Lett. 6, 494–501 (2020).
18. L. Quan, Z. Zhang, X. Zhong, C. Xu, F. Gao, Eva-planner: Environmental adaptive quadrotor
planning, in 2021 IEEE International Conference on Robotics and Automation (ICRA) (IEEE,
2021), pp. 398–404.
19. L. Wang, Y. Guo, Speed adaptive robot trajectory generation based on derivative property of b-
spline curve. IEEE Robot. Autom. Lett. 8, 1905–1911 (2023).
20. B. Zhou, J. Pan, F. Gao, S. Shen, Raptor: Robust and perception-aware trajectory replanning for
quadrotor fast flight. IEEE Trans. Robot. 37, 1992–2009 (2021).
21. H. Oleynikova, Z. Taylor, R. Siegwart, J. Nieto, Safe local exploration for replanning in
cluttered unknown environments for microaerial vehicles. IEEE Robot. Autom. Lett. 3, 1474–
1481 (2018).
22. S. Liu, M. Watterson, S. Tang, V. Kumar, High speed navigation for quadrotors with limited
onboard sensing, in 2016 IEEE International Conference on Robotics and Automation (ICRA)
(IEEE, 2016), pp. 1484–1491.
23. J. Tordesillas, B. T. Lopez, M. Everett, J. P. How, Faster: Fast and safe trajectory planner for
navigation in unknown environments. IEEE Trans. Robot. 38, 922–938 (2021).
25. Livox Technology Company Limited, Livox Mid-360 User Manual (2023);
https://livoxtech.com/mid-360.
26. Wikipedia, Lockheed Martin F-35 Lightning II stealth multirole combat aircraft (2024);
https://en.wikipedia.org/wiki/Lockheed_Martin_F-35_Lightning_II.
28. J. Tordesillas, B. T. Lopez, J. P. How, Faster: Fast and safe trajectory planner for flights in
unknown environments, in 2019 IEEE/RSJ International Conference on Intelligent Robots and
Systems (IROS) (IEEE, 2019), pp. 1934–1940.
29. Z. Wang, X. Zhou, C. Xu, F. Gao, Geometrically constrained trajectory optimization for
multicopters. IEEE Trans. Robot. 38, 3259–3278 (2022).
32. B. Tang, Y. Ren, F. Zhu, R. He, S. Liang, F. Kong, F. Zhang, Bubble explorer: Fast UAV
exploration in large-scale and cluttered 3D-environments using occlusion-free spheres.
arXiv:2304.00852 [cs.RO] (2023).
34. NVIDIA Corporation, Nvidia Jetson TX2 embedded AI computing device (2024);
https://developer.nvidia.com/embedded/jetson-tx2.
35. X. Zhou, X. Wen, Z. Wang, Y. Gao, H. Li, Q. Wang, T. Yang, H. Lu, Y. Cao, C. Xu, F. Gao,
Swarm of micro flying robots in the wild. Sci. Robot. 7, eabm5954 (2022).
37. A. Harmat, M. Trentini, I. Sharf, Multi-camera tracking and mapping for unmanned aerial
vehicles in unstructured environments. J. Intell. Robot. Syst. 78, 291–317 (2015).
39. W. Xu, Y. Cai, D. He, J. Lin, F. Zhang, Fast-lio2: Fast direct lidar-inertial odometry. IEEE
Trans. Robot. 38, 2053–2073 (2022).
42. J. Zhang, C. Hu, R. G. Chadha, S. Singh, Falco: Fast likelihood-based collision avoidance with
extension to human-guided navigation. J. Field Robot. 37, 1300–1313 (2020).
45. Y. Cai, F. Kong, Y. Ren, F. Zhu, J. Lin, F. Zhang, Occupancy grid mapping without raycasting
for high-resolution LiDAR sensors. IEEE Trans. Robot. 40, 172–192 (2023).
51. Y. Ren, Y. Cai, F. Zhu, S. Liang, F. Zhang, Rog-map: An efficient robocentric occupancy grid
map for large-scene and high-resolution lidar-based motion planning. arXiv:2302.14819 [cs.RO]
(2023).
52. F. Kong, W. Xu, Y. Cai, F. Zhang, Avoiding dynamic small obstacles with onboard sensing and
computation on aerial robots. IEEE Robot. Autom. Lett. 6, 7869–7876 (2021).
53. H. Wu, Y. Li, W. Xu, F. Kong, F. Zhang, Moving event detection from LiDAR point streams.
Nat. Commun. 15, 345 (2024).
54. T-MOTOR, T-MOTOR F90 series racing drone motor specifications (2024);
https://store.tmotor.com/goods-1064-F90.html.
55. Dronecode Foundation, PX4: Open source autopilot for drone developers (2024); https://px4.io.
56. J. Chen, K. Su, S. Shen, Real-time safe trajectory generation for quadrotor flight in cluttered
environments, in 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)
(IEEE, 2015), pp. 1678–1685.
57. P. Hart, N. Nilsson, B. Raphael, A formal basis for the heuristic determination of minimum cost
paths. IEEE Trans. Syst. Sci. Cybern. 4, 100–107 (1968).
58. Q. Wang, Z. Wang, C. Xu, F. Gao, Fast iterative region inflation for computing large 2-d/3-d
convex regions of obstacle-free space. arXiv:2403.02977 [cs.RO] (2024).
59. R. Deits, R. Tedrake, Computing large convex regions of obstacle-free space through
semidefinite programming, in Algorithmic Foundations of Robotics XI: Selected Contributions of
the Eleventh International Workshop on the Algorithmic Foundations of Robotics (Springer,
2015), pp. 109–124.
60. L. Yin, F. Zhu, Y. Ren, F. Kong, F. Zhang, Decentralized swarm trajectory generation for lidar-
based aerial tracking in cluttered environments, in 2023 IEEE/RSJ International Conference on
Intelligent Robots and Systems (IROS) (IEEE, 2023), pp. 9285–9292.
61. J. Ji, N. Pan, C. Xu, F. Gao, Elastic tracker: A spatio-temporal trajectory planner for flexible
aerial tracking, in 2022 International Conference on Robotics and Automation (ICRA) (IEEE,
2022), pp. 47–53.
62. S. Liu, N. Atanasov, K. Mohta, V. Kumar, Search-based motion planning for quadrotors using
linear quadratic minimum time control, in 2017 IEEE/RSJ International Conference on
Intelligent Robots and Systems (IROS) (IEEE, 2017), pp. 2872–2879.
63. ZJU FAST Lab, LBFGS-Lite: A header-only L-BFGS unconstrained optimizer (Github, 2024);
https://github.com/ZJU-FAST-Lab/LBFGS-Lite.
65. G. Lu, W. Xu, F. Zhang, On-manifold model predictive control for trajectory tracking on
robotic systems. IEEE Trans. Ind. Electron. 70, 9192–9202 (2022).
66. A. Koubaa, Ed., Robot Operating System (ROS): The Complete Reference (Volume 1), vol. 625
of Studies in Computational Intelligence (Springer, 2017).
67. W. Liu, Y. Ren, F. Zhang, Integrated planning and control for quadrotor navigation in presence
of suddenly appearing objects and disturbances. IEEE Robot. Autom. Lett. 9, 899–906 (2023).
68. C. Toumieh, A. Lambert, Voxel-grid based convex decomposition of 3D space for safe corridor
generation. J. Intell. Robot. Syst. 105, 87 (2022).
69. X. Zhong, Y. Wu, D. Wang, Q. Wang, C. Xu, F. Gao, Generating large convex polytopes
directly on point clouds. arXiv:2010.08744 [cs.RO] (2020).
73. Y. Cui, D. Sun, K.-C. Toh, On the r-superlinear convergence of the KKT residuals generated by
the augmented Lagrangian method for convex composite conic programming. Math. Program.
178, 381–415 (2019).
75. Dronecode Foundation, PX4 software in the loop simulation with Gazebo (2024);
https://docs.px4.io/v1.12/en/simulation/gazebo.html.