Article
Drone Detection and Tracking Using RF Identification Signals
Driss Aouladhadj 1,2,*, Ettien Kpre 2, Virginie Deniau 1, Aymane Kharchouf 2, Christophe Gransart 1
and Christophe Gaquière 2
1 COSYS-LEOST, Université Gustave Eiffel, 20 Rue Élisée Reclus, 59650 Villeneuve-d’Ascq, France;
[email protected] (V.D.); [email protected] (C.G.)
2 MC2 Technologies, 1 Rue Héraclès, 59493 Villeneuve-d’Ascq, France; [email protected] (E.K.);
[email protected] (A.K.); [email protected] (C.G.)
* Correspondence: [email protected]
Abstract: The market for unmanned aerial systems (UASs) has grown considerably worldwide,
but their ability to transmit sensitive information poses a threat to public safety. To counter these
threats, authorities and anti-drone organizations are ensuring that UASs comply with regulations,
focusing on strategies to mitigate the risks associated with malicious drones. This study presents a
technique for detecting drone models using identification (ID) tags in radio frequency (RF) signals,
enabling the extraction of real-time telemetry data through the decoding of Drone ID packets. The
system, implemented with a development board, facilitates efficient drone tracking. A performance
evaluation conducted during a measurement campaign yielded maximum detection distances of 1.3 km for
the Mavic Air, 1.5 km for the Mavic 3, and 3.7 km for the Mavic 2 Pro. The system accurately estimates
a drone’s 2D position, altitude, and speed in real time. Thanks to the decoding of telemetry packets,
the system demonstrates promising accuracy, with worst-case distances between estimated and actual
drone positions of 35 m for the Mavic 2 Pro, 17 m for the Mavic Air, and 15 m for the Mavic 3. In
addition, there is a relative error of 14% for altitude measurements and 7% for speed measurements.
The reaction times calculated to secure a vulnerable site within a 200 m radius are 1.83 min (Mavic
Air), 1.03 min (Mavic 3), and 2.92 min (Mavic 2 Pro). The system proves effective in addressing
emerging concerns about drone-related threats, helping to improve public safety and security.
Keywords: drone; UAV; C-UAS; RF signal; Drone ID; detection system; tracking system; drone position; distance estimation; reaction time

Citation: Aouladhadj, D.; Kpre, E.; Deniau, V.; Kharchouf, A.; Gransart, C.; Gaquière, C. Drone Detection and Tracking Using RF Identification Signals. Sensors 2023, 23, 7650. https://doi.org/10.3390/s23177650
Table 1 (fragment). Radar: RCS reflection or micro-Doppler signature-based detection, with a bandwidth used from 3 MHz to 300 GHz [18,29,30].
Advantages: large RCS; long detection range; 360-degree coverage; all-weather operation.
Disadvantages: confusion with other flying objects, such as birds; LoS is required; expensive.
The passive RF detection method relies on spectral surveillance to identify the com-
munications between the drone and its remote control (RC) within the electromagnetic
spectrum. These methods do not require LoS. However, passive RF detection faces chal-
lenges when the signals emitted by the drone coexist with numerous other signals, such
as Wi-Fi or Bluetooth, which share the same frequency band. This presents a challenge
in attributing each signal to its respective emitter, especially in complex urban environ-
ments. Consequently, it necessitates the collection of RF communication data from each
RF source and the construction of a comprehensive database encompassing various sce-
narios and diverse environments to enhance detection capabilities for more general cases.
To address these challenges, recent studies have made notable contributions. In 2019,
Al-Sa’d et al. [34] introduced an open-source drone database for RF-based detection and
identification, demonstrating the effectiveness of deep neural networks in achieving high
accuracy rates. In 2020, Feng et al. [35] proposed an efficient two-step method for detect-
ing drone hijacking using a combination of genetic algorithm-extreme gradient boosting
(GA-XGBoost) with GPS and inertial measurement unit (IMU) data, achieving high pre-
diction correctness and time savings. In 2021, the study conducted by Basak et al. [31]
introduced a two-stage approach. The detection stage employed goodness-of-fit (GoF)
sensing, while the classification stage utilized the deep recurrent neural network (DRNN)
framework. They developed a customized you only look once (YOLO)-lite framework
from scratch to achieve integrated drone RF signal detection, spectrum localization, and
classification. The performance of both techniques was evaluated using a newly created
drone dataset, demonstrating favorable results in terms of detection and classification.
However, it is important to note that since the classification was conducted in a supervised
manner, the performance may vary when encountering unknown or newer drone signals,
as highlighted in the limitations. In 2022, Medaiyese et al. [36] employed wavelet transform
analytics and machine learning for RF-based UAV detection, achieving 98.9% accuracy with
an image-based signature and a pre-trained convolutional neural network (CNN)-based
model. Kılıç et al. [37] also focused on drone classification based on RF signals, achieving
high accuracy rates using spectral-based audio features and a support vector machine
(SVM)-based machine learning algorithm. In the same year, Sazdic-Jotic et al. [38] pre-
sented a method for single and multiple drone detection and identification using RF-based
deep learning algorithms, achieving high accuracy in both scenarios.
Previous methods used for drone detection and classification exhibit suboptimal
performance. First, to integrate new drones into the full database, it is necessary to study
and record all potential communication scenarios and take into account different channel
and multipath models [32,34,39]. However, this approach can introduce limitations in
terms of flexibility and operational efficiency. Moreover, they rely heavily on AI algorithms
that operate on large data sets, resulting in a procedure that requires significant training
signals, allowing for seamless information exchange. It enables the transfer of precise
control commands, encompassing throttle, pitch, yaw, and roll, to ensure accurate maneu-
vering of the UAV. Furthermore, it facilitates the exchange of crucial information with the
pilot, including UAV position, remaining flight time, distance to target and pilot, payload
specifics, speed, altitude, and video imagery. Additionally, it supports the transmission of
flight missions, acknowledgments, and protocol-dependent data, expanding the scope of
control commands beyond the drone’s speed coordinates.
The drone transmission system typically operates in the industrial, scientific, and
medical (ISM) bands, and the frequency choice depends on the geographical location of the
drone. For example, in France, the 2.4 GHz band offers a wide coverage area but a slower
data transmission speed, while the 5.8 GHz band provides a faster data speed but a more
limited coverage area.
Several communication protocols can be used to establish the RF link between the
drone and its RC, including Wi-Fi, enhanced Wi-Fi, Lightbridge, and OcuSync. The drone’s
range, video transmission quality, latency, available control frequencies, and other related
parameters all depend heavily on the communication protocol employed.
Lightbridge is used by DJI drones such as the Matrice 600 Pro. Lightbridge-equipped drones are capable of transmitting signals over an
extended range, reaching up to 3.6 km in countries subject to CE regulations.
Lightbridge drones leverage advanced features to deliver high-quality video trans-
mission and responsive control for professional aerial photography, cinematography, and
industrial applications. With this protocol, DJI has significantly improved the communica-
tion capabilities of their drones compared to the Wi-Fi and the enhanced Wi-Fi protocols,
offering professionals a reliable and efficient tool for their work [45].
Figure 2. Detection and tracking scenario: downlink (video and telemetry)/uplink (control).
Thus, regardless of whether a drone complies with regulations or not, the system
should detect all drones operating within its vicinity. Taking these different drones into
account, we break the problem down into three detection cases (a dispatch sketch in Python follows the list):
• Drones that communicate using the Wi-Fi standard protocol within the ISM band.
• Drones that transmit a Wi-Fi RDID beacon on channel 6 (at 2437 MHz) within the
2.4 GHz Wi-Fi band.
• DJI drones that transmit the enhanced Wi-Fi DJI Drone ID signal. The specific channel
used by these drones is pseudo-random and can be within either the 2.4 GHz or
5.8 GHz ISM band.
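For illustration, the following minimal Python sketch shows how such a three-way dispatch could be organized; it is not the authors' implementation. The dictionary keys are assumed names for pre-parsed frame attributes, and the OUI value 6A:5C:35 is taken from reference [56] purely as an example.

```python
RDID_FREQ_MHZ = 2437  # Wi-Fi channel 6, where the RDID beacon is expected

def classify(frame: dict) -> str:
    """Route a pre-parsed frame to one of the three detection cases."""
    if frame.get("oui") == "6A:5C:35":
        # Case 3: enhanced Wi-Fi DJI Drone ID on a pseudo-random channel
        # in the 2.4 GHz or 5.8 GHz ISM band.
        return "enhanced-wifi-dji-drone-id"
    if frame.get("freq_mhz") == RDID_FREQ_MHZ and frame.get("has_rdid_ie"):
        # Case 2: Wi-Fi RDID beacon on channel 6 (2437 MHz).
        return "wifi-rdid-beacon"
    if frame.get("protocol") == "802.11":
        # Case 1: drone link using the standard Wi-Fi protocol.
        return "wifi-standard"
    return "unknown"
```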
This article does not cover DJI drones emitting the OcuSync DJI Drone ID [41], nor
drones using communication protocols for which decoding methods are currently under-
going reverse-engineering processes.
On the other hand, the Panda Wi-Fi card offers a coaxial SMA RF connector for
connecting an optimized Rx chain, and supports only 2.4 GHz frequencies, making it ideal
for long-range wireless network deployments. Moreover, it offers a throughput of up to
300 Mbit/s. Both Wi-Fi cards can monitor and intercept Wi-Fi packets, enabling
access to the drone’s data. Wireless card performance has a significant impact on range and
accuracy. Choosing the right Wi-Fi card is, therefore, essential to build an effective drone
detection and tracking system.
6.1.2. Wireshark
Wireshark is an open-source network analysis tool. It decodes captured frames and
understands the different structures of communication protocols. In this work, Wireshark
is employed to parse and decode the RDID packet at 2437 MHz.
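As an illustration of this step, the sketch below uses pyshark, a Python wrapper around Wireshark's tshark, to isolate candidate RDID beacon frames from a capture file. The file name is a placeholder, and the exact layer and field names available depend on the installed Wireshark dissectors.

```python
import pyshark

# Keep IEEE 802.11 beacon frames (type/subtype 0x08) whose SSID contains
# "DJI"; both expressions use standard Wireshark display-filter syntax.
capture = pyshark.FileCapture(
    "rdid_capture.pcap",
    display_filter='wlan.fc.type_subtype == 0x08 && wlan.ssid contains "DJI"',
)

for packet in capture:
    # wlan.ta is the transmitter (drone-side) MAC address of the beacon.
    print("Candidate RDID beacon from", packet.wlan.ta, "at", packet.sniff_time)

capture.close()
```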
6.1.3. Kismet
Kismet is an open-source wireless network detector and sniffer that identifies and associates
access points with wireless clients without emitting detectable frames. It employs a radio
channel hopping algorithm to discover the maximum number of available networks. The
hopping sequence, which can be customized, enables capturing more packets. In this work,
Kismet supports the parsing and decoding of enhanced Wi-Fi DJI Drone ID frames because
its channel hopping can follow the pseudo-random channel selection of these transmissions.
Figure 7 presents the decoded packet; the frame section provides general information
about the captured packet, such as the interface, ID, and length. The IEEE 802.11 beacon
frame indicates that this packet contains wireless management information. The fixed
parameters include the timestamp, beacon interval, and capabilities information. The
tagged parameters provide additional details about the drone, such as the service set
identifier (SSID) (DJI-1581###############), latitude, longitude, altitude above mean sea level
(AMSL), altitude above ground level (AGL), latitude takeoff, longitude takeoff, horizontal
speed, and heading. Furthermore, there are vendor-specific tags that provide information
about the drone manufacturer and serial number.
Figure 7. The output PCAP file from decoding the Mavic 2 Pro RDID packet. Sensitive information,
such as location, MAC address, and serial number, was partly replaced by the '#' symbol.
Figure 8. Monitoring FHSS 5 MHz Wi-Fi signals with Kismet on the 2.4 GHz and 5.8 GHz bands.
Figure 9. The output JSON file from decoding the DJI enhanced Wi-Fi Drone ID. Sensitive information,
such as location, MAC address, and serial number, was partly replaced by the '#' symbol.
This packet contains information about a DJI drone, including its serial number, model,
and frequency histogram usage. Additionally, telemetry data are included such as the
drone’s pitch, yaw, and roll, as well as its speed and altitude. It also includes a timestamp
indicating when the telemetry data were recorded. This information can be used to track
and analyze drone movements and behavior.
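As a minimal sketch, the telemetry fields listed above can be collected from the decoded JSON as follows; the key names are illustrative placeholders rather than the exact field names emitted by Kismet's decoder.

```python
import json

# Load one decoded Drone ID record (file name is a placeholder).
with open("droneid_decoded.json") as f:
    record = json.load(f)

# Gather the telemetry fields of interest; missing keys yield None.
telemetry = {key: record.get(key)
             for key in ("serial_number", "model", "pitch", "yaw", "roll",
                         "speed", "altitude", "timestamp")}
print(telemetry)
```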
The Python code connects to a Kismet server using the kismet_rest API to retrieve
information about a detected wireless device with a specific MAC address. As shown
in the flowchart (Figure 10), the code performs several steps. First, the code imports the
required kismet_rest module. Then, it creates a KismetConnector instance and establishes
a connection to the Kismet server. Next, the code defines a list of MAC address masks,
stored in the mac_list variable, for the devices that need to be detected. If a device is
detected, the code retrieves information such as the device name, full MAC address, and
manufacturer using the kismet_rest.Devices.by_mac function and providing the MAC list.
Additionally, the code employs the kismet.device.base.seenby function to access multiple
dictionary attributes.
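A condensed sketch of this flow is given below. It follows the steps described above and the kismet_rest documentation [58]; the credentials and the MAC mask are placeholders (60:60:1F is one OUI registered to DJI, used here only as an example), and argument names may vary between kismet_rest versions.

```python
import kismet_rest

# Connect to a running Kismet server (default REST port 2501).
conn = kismet_rest.KismetConnector(username="kismet", password="kismet")

# MAC address masks of the devices to watch for (placeholder value).
mac_list = ["60:60:1F:00:00:00/FF:FF:FF:00:00:00"]

devices = kismet_rest.Devices(username="kismet", password="kismet")
for device in devices.by_mac(devices=mac_list):
    # Returned devices are simplified JSON dictionaries.
    print("name:        ", device.get("kismet.device.base.name"))
    print("MAC:         ", device.get("kismet.device.base.macaddr"))
    print("manufacturer:", device.get("kismet.device.base.manuf"))
    print("seen by:     ", device.get("kismet.device.base.seenby"))
```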
7. Experimental Setup
In this section, we present the experimental setup employed to examine the detection
and tracking capabilities of drones. As illustrated in Figure 11, the system incorporates
both active and passive components within its RF chains. The specifications of the setup
RF components are detailed in Table 2. The receiver chain is linked to the Jetson device
for processing.
Figure 12 illustrates the drones tested in the experiments. These drones are the DJI
Mavic Air, which is a Wi-Fi drone equipped with the enhanced Wi-Fi DJI Drone ID, and the
DJI Mavic 2 Pro and DJI Mavic 3, which are OcuSync drones equipped with an RDID. Each
drone selected employs a specific RF protocol and thus represents a different category of
drone. Their specific characteristics are outlined in Table 3. The drones chosen for the study
are representative of many popular drones on the market due to their use of the recent RF
protocols (Wi-Fi, enhanced Wi-Fi, and OcuSync). By focusing on these RF protocols, rather
than on several specific drone models, our results can be extrapolated to a wide variety of
drone models that share the same communication standards.
(a) DJI Mavic Air (b) DJI Mavic 2 Pro (c) DJI Mavic 3
Figure 12. Drone models used in the experiments.
Figure 13. Outdoor mapping for long-distance drone experiments: system location (yellow spot) and
pilot/drone positions (white spots).
We positioned the reference point at the location of the detection system, which is
denoted by a yellow spot in Figure 13. The detection system is equipped with two omni-
directional receiving antennas and placed in an open-field environment. These antennas
were mounted on 2.6 m tripods. The test bed perspective is shown in Figure 14.
Each time, the drone and the RC were positioned at different locations, indicated by
white spots on the map in Figure 13. For some drones, we added further positions between
the white spots to obtain more measurements.
Figure 15. Detected range of the (a) DJI Mavic Air and (b) DJI Mavic 3.
The position error is quantified by the Euclidean distance between the estimated and the precise coordinates:

$d(P) = \sqrt{\left(x_{\text{estimated}}(P) - x(P)\right)^{2} + \left(y_{\text{estimated}}(P) - y(P)\right)^{2}}$

where $(x_{\text{estimated}}(P), y_{\text{estimated}}(P))$ are the estimated coordinates, and $(x(P), y(P))$ are the
precise coordinates of the position of the drone $P$.
Figure 17 shows the Euclidean distance error between the estimated and the real positions.
Figure 16. Precise drone positions, estimated drone positions, and the Haversine distances from the
system estimation.
Figure 17. Euclidean distance error between real and estimated positions.
8.2.3. Estimation of Remaining Distance between the Drone and the System
To find out how far the drone is from the system, we use the Haversine equation
(Equation (3)) [59]. The calculated distances are shown in Figure 16.
$a = \sin^{2}\left(\frac{\Delta\phi}{2}\right) + \cos(\phi_{1})\cos(\phi_{2})\sin^{2}\left(\frac{\Delta\lambda}{2}\right)$
$c = 2 \times \operatorname{atan2}\left(\sqrt{a}, \sqrt{1 - a}\right)$ (3)
$d = R_{\text{earth}} \times c$
The Haversine equation calculates the distance $d$ between two points on Earth, where
$\phi_{1}$ and $\phi_{2}$ are the latitudes of the points, $\Delta\phi$ and $\Delta\lambda$ represent the differences in latitude
and longitude, respectively, and $R_{\text{earth}}$ represents the Earth's radius. This formula accounts
for the Earth's spherical shape. Indeed, traditional Euclidean distance calculations, being
based on flat Cartesian coordinates, are not accurate over long distances. By considering the
Earth's curvature, the Haversine formula provides more accurate distance measurements.
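For reference, a direct Python transcription of Equation (3) is shown below; the coordinates in the usage example are illustrative, not measured positions from the campaign.

```python
from math import radians, sin, cos, sqrt, atan2

R_EARTH_M = 6_371_000  # mean Earth radius in metres

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance d of Equation (3), inputs in decimal degrees."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)   # delta phi
    dlam = radians(lon2 - lon1)   # delta lambda
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    c = 2 * atan2(sqrt(a), sqrt(1 - a))
    return R_EARTH_M * c

# Example: system location vs. a decoded drone fix, about 1.3 km apart.
print(haversine_distance(50.6050, 3.1400, 50.6150, 3.1500))
```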
$\xi_{X}(P) = 100 \times \frac{X_{\text{decoded}}(P) - X_{\text{precise}}(P)}{X_{\text{precise}}(P)}$ (4)
Equation (4) represents the relative error $\xi_{X}(P)$ for parameter $X$ at position $P$. Here,
$X_{\text{decoded}}(P)$ denotes the decoded value of the parameter $X$ at position $P$, and $X_{\text{precise}}(P)$
signifies the reference parameter $X$ at position $P$. As the pilot was in proximity to the drone
at each position, we assume that the parameters received by the pilot and recorded in the
pilot's phone telemetry history represent accurate information about the flight. Thus,
$X_{\text{precise}}(P)$ is extracted from the telemetry information in the phone's flight history. On
the other hand, the measured parameters represent the decoded values $X_{\text{decoded}}(P)$ provided
by the detection system.
To assess the errors in altitude and speed, we substitute $X$ with the altitude $H$ and the
speed $V$ of the tracked drone, estimated by the system. The formula becomes Equation (5):

$\xi_{H}(P) = 100 \times \frac{H_{\text{decoded}}(P) - H_{\text{precise}}(P)}{H_{\text{precise}}(P)}, \qquad \xi_{V}(P) = 100 \times \frac{V_{\text{decoded}}(P) - V_{\text{precise}}(P)}{V_{\text{precise}}(P)}$ (5)
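A short sketch of Equations (4) and (5) follows; the decoded/reference pairs are illustrative values chosen to reproduce the 14% altitude and 7% speed errors quoted in the abstract, not measurements.

```python
def relative_error(decoded: float, precise: float) -> float:
    """xi_X(P) = 100 * (X_decoded(P) - X_precise(P)) / X_precise(P)."""
    return 100.0 * (decoded - precise) / precise

print(relative_error(decoded=57.0, precise=50.0))  # altitude H: 14.0 %
print(relative_error(decoded=10.7, precise=10.0))  # speed V: 7.0 %
```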
The relative altitude and speed error curves obtained from this formula, expressed
as a percentage for each position, are shown in Figure 18. The blue curve corresponds to
the Mavic Air drone, the red curve represents the Mavic 2 Pro drone, and the green curve
represents the Mavic 3 drone.
$T_{\text{to-react}} = \frac{d_{\text{drone}} - r_{\text{site}}}{v_{\text{drone}}}$ (6)
where $T_{\text{to-react}}$ represents the remaining time available to react, $d_{\text{drone}}$ is the distance from
the drone to the vulnerable site, $r_{\text{site}}$ is the fixed radius of the site to protect, and $v_{\text{drone}}$
is the maximum linear speed of the drone. This equation provides valuable insights into
the time required to take appropriate actions based on the drone’s position and maximum
linear speed. It offers a simplified representation of the remaining reaction time once the
drone is detected.
Equation (6) considers a worst-case scenario in which the drone travels at its maximum
speed and approaches the vulnerable site along a direct linear trajectory.
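The sketch below transcribes Equation (6); the 10 m/s speed is an illustrative value, which, with a 1.3 km detection distance and the 200 m protected radius, reproduces a reaction time of about 1.83 min.

```python
def time_to_react(d_drone_m: float, r_site_m: float, v_drone_ms: float) -> float:
    """Remaining reaction time T_to-react of Equation (6), in seconds."""
    return (d_drone_m - r_site_m) / v_drone_ms

t = time_to_react(d_drone_m=1300.0, r_site_m=200.0, v_drone_ms=10.0)
print(f"{t:.0f} s ({t / 60:.2f} min)")  # 110 s (1.83 min)
```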
(a) Front side of the system (b) Lateral side of the system
Figure 19. Front and lateral sides of the system.
Author Contributions: Conceptualization, D.A. and E.K.; methodology, D.A. and V.D.; software, D.A.
and A.K.; validation, D.A., A.K., E.K. and V.D.; formal analysis, D.A.; investigation, D.A., E.K. and
A.K.; resources, D.A., E.K. and A.K.; data curation, D.A.; writing—original draft preparation, D.A.;
writing—review and editing, V.D. and C.G. (Christophe Gransart); visualization, D.A.; supervision,
V.D.; project administration, C.G. (Christophe Gaquière); funding acquisition, C.G. (Christophe
Gaquière). All authors have read and agreed to the published version of the manuscript.
Funding: This research was funded by the Association Nationale Recherche Technologie (ANRT), grant number 2020/0355.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
References
1. Dilshad, N.; Hwang, J.; Song, J.; Sung, N. Applications and challenges in video surveillance via drone: A brief survey.
In Proceedings of the 2020 International Conference on Information and Communication Technology Convergence (ICTC),
Jeju Island, Republic of Korea, 21–23 October 2020; pp. 728–732.
2. Fine, J.D.; Litsey, E.M. Drone Laying Honey Bee Workers in Queen Monitoring Cages. J. Insect Sci. 2022, 22, 13. [CrossRef]
[PubMed]
3. Meng, S.; Guo, X.; Li, D.; Liu, G. The multi-visit drone routing problem for pickup and delivery services. Transp. Res. Part E
Logist. Transp. Rev. 2023, 169, 102990. [CrossRef]
4. Mora, P.; Araujo, C.A.S. Delivering blood components through drones: A lean approach to the blood supply chain. Supply Chain.
Forum Int. J. 2022, 23, 113–123.
5. Hiebert, B.; Nouvet, E.; Jeyabalan, V.; Donelle, L. The application of drones in healthcare and health-related services in north
america: A scoping review. Drones 2020, 4, 30. [CrossRef]
6. Hanover, D.; Loquercio, A.; Bauersfeld, L.; Romero, A.; Penicka, R.; Song, Y.; Cioffi, G.; Kaufmann, E.; Scaramuzza, D. Past,
Present, and Future of Autonomous Drone Racing: A Survey. arXiv 2023, arXiv:2301.01755.
7. Tang, J.; Chen, X.; Zhu, X.; Zhu, F. Dynamic reallocation model of multiple unmanned aerial vehicle tasks in emergent adjustment
scenarios. IEEE Trans. Aerosp. Electron. Syst. 2022, 59, 1139–1155. [CrossRef]
8. Tang, J.; Liu, G.; Pan, Q. A review on representative swarm intelligence algorithms for solving optimization problems: Applica-
tions and trends. IEEE/CAA J. Autom. Sin. 2021, 8, 1627–1643. [CrossRef]
9. Lykou, G.; Moustakas, D.; Gritzalis, D. Defending airports from UAS: A survey on cyber-attacks and counter-drone sensing
technologies. Sensors 2020, 20, 3537. [CrossRef] [PubMed]
10. Tatara, B.A. The Role of Law in Facing Asymmetric Warfare Through Illicit Drug Trafficking in Indonesia. J. Law Sci. 2023, 5, 1–9.
[CrossRef]
11. Evangelista, M.; Shue, H. The American Way of Bombing: Changing Ethical and Legal Norms, from Flying Fortresses to Drones; Cornell
University Press: Ithaca, NY, USA, 2014.
12. Michel, A.H. Counter-Drone Systems, 2nd ed.; Center for the Study of the Drone at Bard College: Annandale-On-Hudson, NY,
USA, 2019.
13. Congressional Research Service (CRS). Department of Defense Counter-Unmanned Aircraft Systems. Available online: https:
//sgp.fas.org/crs/weapons/IF11426.pdf (accessed on 31 August 2023).
14. Shi, X.; Yang, C.; Xie, W.; Liang, C.; Shi, Z.; Chen, J. Anti-drone system with multiple surveillance technologies: Architecture,
implementation, and challenges. IEEE Commun. Mag. 2018, 56, 68–74. [CrossRef]
15. Castrillo, V.U.; Manco, A.; Pascarella, D.; Gigante, G. A review of counter-UAS technologies for cooperative defensive teams of
drones. Drones 2022, 6, 65. [CrossRef]
16. Park, S.; Kim, H.T.; Lee, S.; Joo, H.; Kim, H. Survey on anti-drone systems: Components, designs, and challenges. IEEE Access
2021, 9, 42635–42659. [CrossRef]
17. Chiper, F.L.; Martian, A.; Vladeanu, C.; Marghescu, I.; Craciunescu, R.; Fratu, O. Drone detection and defense systems: Survey
and a software-defined radio-based solution. Sensors 2022, 22, 1453. [CrossRef] [PubMed]
18. Coluccia, A.; Parisi, G.; Fascista, A. Detection and classification of multirotor drones in radar sensor networks: A review. Sensors
2020, 20, 4172. [CrossRef]
19. Guvenc, I.; Koohifar, F.; Singh, S.; Sichitiu, M.L.; Matolak, D. Detection, tracking, and interdiction for amateur drones. IEEE
Commun. Mag. 2018, 56, 75–81. [CrossRef]
20. Zitar, R.A.; Mohsen, A.; Seghrouchni, A.E.; Barbaresco, F.; Al-Dmour, N.A. Intensive Review of Drones Detection and Tracking:
Linear Kalman Filter Versus Nonlinear Regression, an Analysis Case. Arch. Comput. Methods Eng. 2023, 30, 2811–2830. [CrossRef]
21. Kamanlı, A.F. Real Time Uav (Unmanned Vehicle) Tracking with Object Detection in the Air: From Simulation to Real Life
Application. Available at SSRN 4329687. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4329687
(accessed on 31 August 2023).
22. Zheleva, M.; Anderson, C.R.; Aksoy, M.; Johnson, J.T.; Affinnih, H.; DePree, C.G. Radio Dynamic Zones: Motivations, Challenges,
and Opportunities to Catalyze Spectrum Coexistence. IEEE Commu. Mag. 2023, 61, 156–162. [CrossRef]
23. He, Z.; Tan, T. Survey on Worldwide Implementation of Remote Identification and Discussion on Drone Identification in China.
In Proceedings of the 2021 IEEE 3rd International Conference on Civil Aviation Safety and Information Technology (ICCASIT),
Changsha, China, 20–22 October 2021; pp. 252–258. [CrossRef]
24. Zitar, R.A.; Al-Betar, M.; Ryalat, M.; Kassaymehd, S. A review of UAV Visual Detection and Tracking Methods. arXiv 2023,
arXiv:2306.05089.
25. Aydin, B.; Singha, S. Drone Detection Using YOLOv5. Eng 2023, 4, 416–433. [CrossRef]
26. Svanström, F.; Alonso-Fernandez, F.; Englund, C. Drone Detection and Tracking in Real-Time by Fusion of Different Sensing
Modalities. Drones 2022, 6, 317. [CrossRef]
27. Go, Y.J.; Choi, J.S. An Acoustic Source Localization Method Using a Drone-Mounted Phased Microphone Array. Drones 2021,
5, 75. [CrossRef]
28. Salvati, D.; Drioli, C.; Ferrin, G.; Foresti, G.L. Acoustic source localization from multirotor UAVs. IEEE Trans. Ind. Electron. 2019,
67, 8618–8628. [CrossRef]
29. Gong, J.; Yan, J.; Li, D.; Kong, D.; Hu, H. Interference of radar detection of drones by birds. Prog. Electromagn. Res. M 2019,
81, 1–11. [CrossRef]
30. Ezuma, M.; Anjinappa, C.K.; Funderburk, M.; Guvenc, I. Radar cross section based statistical recognition of UAVs at microwave
frequencies. IEEE Trans. Aerosp. Electron. Syst. 2021, 58, 27–46. [CrossRef]
31. Basak, S.; Rajendran, S.; Pollin, S.; Scheers, B. Combined RF-based drone detection and classification. IEEE Trans. Cogn. Commun.
Netw. 2021, 8, 111–120. [CrossRef]
32. Allahham, M.S.; Al-Sa’d, M.F.; Al-Ali, A.; Mohamed, A.; Khattab, T.; Erbad, A. DroneRF dataset: A dataset of drones for RF-based
detection, classification and identification. Data Brief 2019, 26, 104313. [CrossRef]
33. Alam, S.S.; Chakma, A.; Rahman, M.H.; Bin Mofidul, R.; Alam, M.M.; Utama, I.B.K.Y.; Jang, Y.M. RF-Enabled Deep-Learning-
Assisted Drone Detection and Identification: An End-to-End Approach. Sensors 2023, 23, 4202. [CrossRef]
34. Al-Sa’d, M.F.; Al-Ali, A.; Mohamed, A.; Khattab, T.; Erbad, A. RF-based drone detection and identification using deep learning
approaches: An initiative towards a large open source drone database. Future Gener. Comput. Syst. 2019, 100, 86–97. [CrossRef]
35. Feng, Z.; Guan, N.; Lv, M.; Liu, W.; Deng, Q.; Liu, X.; Yi, W. Efficient drone hijacking detection using two-step GA-XGBoost. J.
Syst. Archit. 2020, 103, 101694. [CrossRef]
36. Medaiyese, O.O.; Ezuma, M.; Lauf, A.P.; Guvenc, I. Wavelet transform analytics for RF-based UAV detection and identification
system using machine learning. Pervasive Mob. Comput. 2022, 82, 101569. [CrossRef]
37. Kılıç, R.; Kumbasar, N.; Oral, E.A.; Ozbek, I.Y. Drone classification using RF signal based spectral features. Eng. Sci. Technol. Int. J.
2022, 28, 101028. [CrossRef]
38. Sazdić-Jotić, B.; Pokrajac, I.; Bajčetić, J.; Bondžulić, B.; Obradović, D. Single and multiple drones detection and identification
using RF based deep learning algorithm. Expert Syst. Appl. 2022, 187, 115928. [CrossRef]
39. Zhang, H.; Li, T.; Li, Y.; Li, J.; Dobre, O.A.; Wen, Z. RF-based drone classification under complex electromagnetic environments
using deep learning. IEEE Sens. J. 2023, 23, 6099–6108. [CrossRef]
40. Christof, T. DJI Wi-Fi Protocol Reverse Engineering. Bachelor’s Thesis, Institute of Networks and Security, Johannes Kepler
Universität Linz, Linz, Austria, November 2021.
41. Bender, C. DJI drone IDs are not encrypted. arXiv 2022, arXiv:2207.10795.
42. Department 13. Anatomy of DJI's Drone Identification Implementation; White Paper; Canberra, Australia, 2017. Available online:
https://petapixel.com/assets/uploads/2022/08/Anatomy-of-DJI-Drone-ID-Implementation1.pdf (accessed on 31 August 2023).
43. DJI Aeroscope. Available online: https://www.dji.com/fr/aeroscope (accessed on 19 July 2023).
44. Swinney, C.J.; Woods, J.C. Low-Cost Raspberry-Pi-Based UAS Detection and Classification System Using Machine Learning.
Aerospace 2022, 9, 738. [CrossRef]
45. heliguy™ Blog. DJI Transmission Systems: Wi-Fi, OcuSync, Lightbridge. Published Online on 1 March 2022. Available online:
https://www.heliguy.com/blogs/posts/dji-transmission-systems-wi-fi-ocusync-lightbridge (accessed on 31 August 2023).
46. Flynt, J. The DJI Transmission Systems OcuSync 2 vs. Lightbridge 2. Published on 25 September 2020. Available online:
https://3dinsider.com/ocusync-2-vs-lightbridge-2/ (accessed on 31 August 2023).
47. Travel, E.W. What Is DJI Ocusync And How Does It Work? Expert World Travel, 4 February 2017. Available online:
https://store.dji.bg/en/blog/what-is-dji-ocusync-and-how-does-it-work#:~:text=Ocusync%20can%20transmit%20video%20
at,much%20data%20at%20longer%20distances (accessed on 7 April 2023).
48. TheDronestop. DJI Ocusync (What Is It, Why It's so Important, Updates of Ocusync). Published on 1 January 2023. Available
online: https://thedronestop.com/dji-ocusync-everything-you-need-to-know/ (accessed on 7 April 2023).
49. Belwafi, K.; Alkadi, R.; Alameri, S.A.; Hamadi, H.A.; Shoufan, A. Unmanned Aerial Vehicles’ Remote Identification: A Tutorial
and Survey. IEEE Access 2022, 10, 87577–87601. [CrossRef]
50. Tedeschi, P.; Al Nuaimi, F.A.; Awad, A.I.; Natalizio, E. Privacy-Aware Remote Identification for Unmanned Aerial Vehicles:
Current Solutions, Potential Threats, and Future Directions. IEEE Trans. Ind. Inform. 2023. [CrossRef]
51. Friis, S. Open Drone ID. Online GitHub Repository, Version 2.0, Published on 6 April 2022. Available online: https://github.com/
opendroneid/opendroneid-core-c (accessed on 31 August 2023).
52. Intel Corporation. Intel Wireless AC 8265 Dual Band. Product Datasheet. Available online: https://www.intel.fr/content/
www/fr/fr/products/sku/94150/intel-dual-band-wirelessac-8265/specifications.html (accessed on 31 August 2023).
53. Panda Wireless. Panda Wireless PAU06 300Mbps, Centos, Kali Linux and Raspbian. Available online: https://www.amazon.fr/
Panda-Wireless-PAU06-Adaptateur-Raspbian/dp/B00JDVRCI0 (accessed on 31 August 2023).
54. Allan, A. List of MAC Addresses with Vendors Identities. Online GitHub Repository, Created on 2 February 2017. Available
online: https://gist.github.com/aallan/b4bb86db86079509e6159810ae9bd3e4 (accessed on 31 August 2023).
55. Wikipédia. Adresse MAC. Last Modification on 19 July 2023. Available online: https://fr.wikipedia.org/wiki/Adresse_MAC#:~:
text=Une%20adresse%20MAC%20(de%20l,Elle%20est%20unique%20au%20monde. (accessed on 31 August 2023).
56. Secrétariat Général de la Défense et de la Sécurité Nationale. OUI: 6A:5C:35, 2019. Available online: https://maclookup.app/
macaddress/6A5C35 (accessed on 31 August 2023).
57. Kershaw, M. Drone ID. Online GitHub Repository. Available online: https://github.com/kismetwireless/kismet/blob/master/
kaitai_definitions_disabled/dot11_ie_221_dji_droneid.ksy (accessed on 31 August 2023).
58. Kershaw, M.; Dragorn. Online Resource: kismet_rest Documentation. Available online: https://kismet-rest.readthedocs.io/_/
downloads/en/latest/pdf/ (accessed on 31 August 2023).
59. Andreou, A.; Mavromoustakis, C.X.; Batalla, J.M.; Markakis, E.K.; Mastorakis, G.; Mumtaz, S. UAV Trajectory Optimisation in
Smart Cities using Modified A* Algorithm Combined with Haversine and Vincenty Formulas. IEEE Trans. Veh. Technol. 2023, 72,
9757–9769. [CrossRef]
60. Matić, V.; Kosjer, V.; Lebl, A.; Pavić, B.; Radivojević, J. Methods for Drone Detection and Jamming. In Proceedings of the 10th
International Conference on Information Society and Technology (ICIST), Kopaonik, Serbia, 8–11 March 2020; pp. 16–21.
61. Abunada, A.H.; Osman, A.Y.; Khandakar, A.; Chowdhury, M.E.H.; Khattab, T.; Touati, F. Design and implementation of a
RF based anti-drone system. In Proceedings of the 2020 IEEE International Conference on Informatics, IoT, and Enabling
Technologies (ICIoT), Doha, Qatar, 2–5 February 2020; pp. 35–42.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.