Robust Low-Cost Drone Detection and Classification
2 Institute for Communication Systems, Eastern Switzerland University of Applied Sciences, 8640 Rapperswil-Jona, Switzerland
3 Armasuisse Science + Technology, 3602 Thun, Switzerland
Corresponding author: Stefan Glüge (email: [email protected]).
ABSTRACT The proliferation of drones, or unmanned aerial vehicles (UAVs), has raised significant
safety concerns due to their potential misuse in activities such as espionage, smuggling, and infrastructure
disruption. This paper addresses the critical need for effective drone detection and classification systems that
operate independently of UAV cooperation. We evaluate various convolutional neural networks (CNNs)
for their ability to detect and classify drones using spectrogram data derived from consecutive Fourier
transforms of signal components. The focus is on model robustness in low signal-to-noise ratio (SNR)
environments, which is critical for real-world applications. A comprehensive dataset is provided to support
future model development. In addition, we demonstrate a low-cost drone detection system using a standard
computer, software-defined radio (SDR) and antenna, validated through real-world field testing. On our
development dataset, all models consistently achieved an average balanced classification accuracy of
≥ 85% at SNR > −12 dB. In the field test, these models achieved an average balanced accuracy of
> 80%, depending on transmitter distance and antenna direction. Our contributions include: a publicly
available dataset for model development, a comparative analysis of CNNs for drone detection under low
SNR conditions, and the deployment and field evaluation of a practical, low-cost detection system.
INDEX TERMS Deep neural networks, Robustness, Signal detection, Unmanned aerial vehicles
…examine the hierarchical relationships within the learned features.
• We implement the models in a low-cost detection system and evaluate them in a field test.

A. RELATED WORK
A literature review of drone detection methods based on deep learning (DL) is given in [1] and [2]. Both works reflect the state of the art in 2024. Different DL algorithms are discussed with respect to the techniques used to detect drones based on visual, radar, acoustic, and RF signals. Given these general overviews, we briefly summarise recent work based on RF data, with a particular focus on the data side of the problem, to motivate our work.

With the advent of DL-based methods, the data used to train models became the cornerstone of any detection system. Table 1 provides an overview of openly available datasets of RF drone signals. The DroneRF dataset [3] is one of the first openly available datasets. It contains RF time series data from three drones in four flight modes (i.e. on, hovering, flying, video recording), recorded by two universal software radio peripheral (USRP) SDR transceivers [4]. The dataset is widely used and has enabled follow-up work on a variety of classification systems: DL-based [5], [6], focused on pre-processing and combining signals from two frequency bands [7], genetic-algorithm-based heterogeneous integrated k-nearest neighbour [8], and hierarchical reinforcement-learning-based [9]. In general, the classification accuracies reported on the DroneRF dataset are close to 100%. Specifically, [4], [5], and [6] report average accuracies of 99.7%, 100%, and 99.98%, respectively, for detecting the presence of a drone. There is therefore an obvious need for a harder, more realistic dataset.

Consequently, [10] investigated the detection and classification of drones in the presence of Bluetooth and Wi-Fi signals. Their system used a multi-stage detector to distinguish drone signals from the background noise and interfering signals. Once a signal was identified as a drone signal, it was classified using machine learning (ML) techniques. The detection performance of the proposed system was evaluated for different SNRs. The corresponding recordings (17 drone remote controllers from eight different manufacturers) are openly available [11]. Unfortunately, the Bluetooth/Wi-Fi noise is not part of the dataset. Ozturk et al. [12] used the dataset to further investigate the classification of RF fingerprints at low SNRs by adding white Gaussian noise to the raw data. Using a CNN, they achieved classification accuracies ranging from 92% to 100% for SNR ∈ [−10, 30] dB.

The openly available DroneDetect dataset [13] was created by Swinney and Woods [14]. It contains raw in-phase and quadrature (IQ) data recorded with a BladeRF SDR. Seven drone models were recorded in three different flight modes and in the presence of interference from a Bluetooth speaker, a Wi-Fi hotspot, and simultaneous Bluetooth and Wi-Fi interference. The dataset does not include measurements without drones, which would be necessary to evaluate a drone detection system. The results in [14] show that Bluetooth signals are more likely to interfere with detection and classification accuracy than Wi-Fi signals. Overall, frequency-domain features extracted from a CNN were shown to be more robust than time-domain features in the presence of interference. In [15], the drone signals from the DroneDetect dataset were augmented with Gaussian noise and SDR-recorded background noise, so that the proposed approach could be evaluated regarding its capability to detect drones. They trained a CNN end-to-end on the raw IQ data and report an accuracy of 99% for detection and between 72% and 94% for classification.

The Cardinal RF dataset [16] consists of raw time series data from six drones plus their controllers, two Wi-Fi devices, and two Bluetooth devices. Based on this dataset, Medaiyese et al. [17] proposed a semi-supervised framework for UAV detection using wavelet analysis. Accuracies between 86% and 97% were achieved at SNRs of 30 dB and 18 dB, while they dropped to chance level for SNRs below 10 dB to 6 dB. In addition, [18] investigated different wavelet transforms for feature extraction from the RF signals. Using the wavelet scattering transform from the steady state of the RF signals at 30 dB SNR to train SqueezeNet [19], they achieved an accuracy of 98.9% at 10 dB SNR.

In our previous work [20], we created the noisy drone RF signals dataset¹ from six drones and four remote controllers. It consists of non-overlapping signal vectors of 16384 samples, corresponding to ≈ 1.2 ms at 14 MHz. We added Labnoise (Bluetooth, Wi-Fi, amplifier) and Gaussian noise to the dataset and mixed it with the drone signals at SNR ∈ [−20, 30] dB. Using IQ data and spectrogram data to train different CNNs, we found an advantage in favour of the 2D spectrogram representation of the data. There was no performance difference at SNR ≥ 0 dB, but a major improvement in balanced accuracy at low SNR levels, i.e. 84.2% on the spectrogram data compared to 41.3% on the IQ data at −12 dB SNR.

Recently, [21] proposed an anchor-free object detector based on keypoints for drone RF signal spectrograms. They also proposed an adversarial-learning-based data adaptation method to generate domain-independent and domain-aligned features. Given five different types of drones, they report a mean average precision of 97.36%, which drops to ≈ 55% when adding Gaussian noise at −25 dB SNR. The raw data used in their work is available², but unfortunately not usable without further documentation.

¹ https://www.kaggle.com/datasets/sgluege/
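To make the noise mixing at prescribed SNRs concrete (the procedure referred to above for [20] and used again for our development dataset), the following is a minimal NumPy sketch. The function name and the power normalisation are our own illustration, not the exact pipeline of [20]:

```python
import numpy as np

def mix_at_snr(signal: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Scale `noise` so that the signal-to-noise ratio of the mix is `snr_db`.

    Both inputs are complex IQ vectors of equal length.
    """
    p_signal = np.mean(np.abs(signal) ** 2)   # average signal power
    p_noise = np.mean(np.abs(noise) ** 2)     # average noise power
    # Required noise power for the target SNR: SNR = 10*log10(Ps/Pn)
    target_p_noise = p_signal / (10 ** (snr_db / 10))
    return signal + noise * np.sqrt(target_p_noise / p_noise)

# Example: a placeholder "drone burst" plus complex Gaussian noise at -12 dB SNR
rng = np.random.default_rng(0)
sig = np.exp(2j * np.pi * 0.1 * np.arange(16384))
noi = (rng.standard_normal(16384) + 1j * rng.standard_normal(16384)) / np.sqrt(2)
mix = mix_at_snr(sig, noi, snr_db=-12.0)
```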
B. MOTIVATION
As we have seen in other fields, such as computer vision, the success of DL can be attributed to: (a) high-capacity models; (b) increased computational power; and (c) the availability of large amounts of labelled data [22]. Thus, given the large amount of available raw RF signals (cf. Tab. 1), we promote the idea of open and reusable data to facilitate model development and model comparison.

With the noisy drone RF signals dataset [20], we provided a first ready-to-use dataset to enable rapid model development, without the need for any data preparation. Furthermore, the dataset contains samples that can be considered "hard" in terms of noise, i.e. Bluetooth + Wi-Fi + Gaussian noise at very low SNRs, and it allows a direct comparison with the published results.

While the models proposed in [20] performed reasonably well in the training/lab setting, we found it difficult to transfer their performance to practical application. The reason was the choice of rather short signal vectors of 16384 samples, corresponding to ≈ 1.2 ms at 14 MHz. Since the drone signals occur in short bursts of ≈ 1.3–2 ms with a repetition period of ≈ 60–600 ms, our continuously running classifier predicts a drone whenever a burst occurs and noise during the repetition period of the signal. Therefore, in order to provide a stable and reliable classification every second, one would need an additional "layer" to pool the classifier outputs given every 1.2 ms.

In the present work, we follow a data-centric approach and simply increase the length of the input signal to ≈ 75 ms to train a classifier in an end-to-end manner. Again, we provide the data used for model development in the hope that it will inspire others to develop better models.

In the next section, we briefly describe the data collection and preprocessing procedure. Section III describes the model architectures and their training/validation method. In addition, we describe the setup of the low-cost drone detection system and of the field test. The resulting performance metrics are presented in Section IV and are further discussed in Section V.

II. MATERIALS
We used the raw RF signals from the drones that were collected in [20]. Nevertheless, we briefly describe the data acquisition process again to provide a complete picture of the development from the raw RF signal to the deployment of a detection system within a single manuscript.

FIGURE 1: Recording of drone signals in the anechoic chamber. A DJI Phantom 4 Pro drone with the DJI Phantom GL300F remote control.

A. DATA ACQUISITION
The drone's remote control and, if present, the drone itself were placed in an anechoic chamber to record the raw RF signal without interference for at least one minute. The signals were received by a log-periodic antenna and sampled and stored by an Ettus Research USRP B210, see Fig. 1. In the static measurement, the respective signals of the remote control (TX) alone or together with the drone (RX) were measured. In the dynamic measurement, one person at a time was inside the anechoic chamber and operated the remote control (TX) to generate a signal as close to reality as possible. All signals were recorded at a sampling frequency of 56 MHz (the highest possible real-time bandwidth). All drone models and recording parameters are listed in Tab. 2, including both uplink and downlink signals.

We also recorded three types of noise and interference. First, Bluetooth/Wi-Fi noise was recorded using the hardware setup described above. Measurements were taken in a public and busy university building. In this open recording setup, we had no control over the exact number or types of active Bluetooth/Wi-Fi devices or the actual traffic in progress.
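For illustration, a minimal SciPy [23] sketch of how an IQ vector is turned into the log-power spectrogram representation used throughout (consecutive Fourier transforms over the signal). The STFT parameters and the input length of 2^20 = 1,048,576 samples (≈ 74.9 ms at 14 MHz) are our assumptions derived from the figures above, not the exact settings of the released dataset:

```python
import numpy as np
from scipy import signal

FS = 14e6        # sampling rate of the stored signal vectors (14 MHz)
N = 2 ** 20      # ~75 ms of IQ data (assumed input length)

def log_power_spectrogram(iq: np.ndarray, nperseg: int = 1024) -> np.ndarray:
    """Consecutive FFTs over a complex IQ vector -> log-power spectrogram."""
    f, t, sxx = signal.spectrogram(
        iq, fs=FS, nperseg=nperseg, noverlap=0,
        return_onesided=False,          # complex input: keep both frequency halves
    )
    sxx = np.fft.fftshift(sxx, axes=0)  # centre DC so the result reads as an image
    return 10 * np.log10(sxx + 1e-12)   # dB scale, numerically safe

iq = np.zeros(N, dtype=np.complex64)    # stand-in for a recorded vector
spec = log_power_spectrogram(iq)        # shape: (1024, N // 1024)
```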
TABLE 2: Transmitters and receivers recorded in the development dataset and their respective class labels. Additionally, we show the center frequency (GHz), the channel spacing (MHz), the burst duration (ms), and the repetition period of the respective signals (ms).

Transmitter | Receiver | Label | Center Freq. (GHz) | Spacing (MHz) | Duration (ms) | Repetition (ms)
DJI Phantom GL300F | DJI Phantom 4 Pro | DJI | 2.44175 | 1.7 | 2.18 | 630
Futaba T7C | – | FutabaT7 | 2.44175 | 2 | 1.7 | 288
Futaba T14SG | Futaba R7008SB | FutabaT14 | 2.44175 | 3.1 | 1.4 | 330
Graupner mx-16 | Graupner GR-16 | Graupner | 2.44175 | 1 | 1.9/3.7 | 750
Bluetooth/Wi-Fi Noise | – | Noise | 2.44175 | – | – | –
Taranis ACCST | X8R Receiver | Taranis | 2.440 | 1.5 | 3.1/4.4 | 420
Turnigy 9X | – | Turnigy | 2.445 | 2 | 1.3 | 61, 120–2900 ᵃ

ᵃ The repetition period of the Turnigy transmitter is not static: the first bursts were observed after 61 ms, and subsequent bursts were observed at intervals in the range [120, 2900] ms.
FIGURE 2: Log power spectrogram and IQ data samples from the development dataset at different SNRs (a–d).
FIGURE 3: Block diagram of the detection prototype. A USRP B210 streams IQ data (2.4–2.48 GHz) over USB3 to a mobile computer, where a GNU Radio GUI feeds a PyTorch model on the GPU that outputs the classification results.
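The block diagram above implies a straightforward streaming pipeline. A condensed sketch of such a loop follows; the ZeroMQ hand-off from GNU Radio, the exported model file, the class order, and all parameter values are our own assumptions, not the exact prototype implementation:

```python
import numpy as np
import torch
import zmq
from scipy import signal

# Receive complex64 IQ chunks published by a GNU Radio flowgraph (ZMQ PUB sink).
ctx = zmq.Context()
sock = ctx.socket(zmq.SUB)
sock.connect("tcp://localhost:5555")        # assumed flowgraph endpoint
sock.setsockopt(zmq.SUBSCRIBE, b"")

model = torch.jit.load("drone_cnn.pt").eval()   # hypothetical exported model
CLASSES = ["DJI", "FutabaT14", "FutabaT7", "Graupner", "Noise", "Taranis", "Turnigy"]

N = 2 ** 20                                 # ~75 ms at 14 MHz
buf = np.empty(0, dtype=np.complex64)

with torch.no_grad():
    while True:
        buf = np.concatenate([buf, np.frombuffer(sock.recv(), dtype=np.complex64)])
        while buf.size >= N:                # classify one full window at a time
            iq, buf = buf[:N], buf[N:]
            _, _, sxx = signal.spectrogram(iq, fs=14e6, nperseg=1024,
                                           noverlap=0, return_onesided=False)
            x = torch.from_numpy(
                10 * np.log10(np.fft.fftshift(sxx, axes=0) + 1e-12)
            ).float()[None, None]           # (batch, channel, freq, time)
            print(CLASSES[model(x).argmax(dim=1).item()])
```

With such a split, the SDR framework handles tuning and sample transport, while the Python side only has to batch IQ vectors and run the CNN.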
…transmitters, and at 180°, in the opposite direction. Directing the antenna in the opposite direction should result in ≈ 20 dB attenuation of the radio signals.

FIGURE 4: Detection prototype at the Zurich Lake in Rapperswil.
Table 4 lists the drones and/or remote controls used in the field test. Note that the Graupner drone and remote control are part of the development dataset (cf. Tab. 2) but were not measured in the field experiment. We assume that no other drones were present during the measurements, so recordings where none of our transmitters were used are labelled as "Noise". For each transmitter, distance, and angle, 20 to 30 s, i.e. approximately 300 spectrograms, were classified live and recorded. The resulting numbers of samples for each class, distance, and angle are shown in Tab. 5.

TABLE 4: Drones and/or remotes used in the field test
Class | Drone/remote control
DJI | DJI Phantom 4 Pro drone and remote control
FutabaT14 | Futaba T14 remote control
FutabaT7 | Futaba T7 remote control
Taranis | FrSky Taranis Q X7 remote control
Turnigy | Turnigy Evolution remote control
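Since the classes are not equally represented across distances and angles, we report balanced accuracy, i.e. the mean of the per-class recalls, so the frequent "Noise" class cannot dominate the score. A small sketch of the metric (scikit-learn is our assumed tooling here, not necessarily what produced the reported numbers):

```python
import numpy as np
from sklearn.metrics import balanced_accuracy_score

# Balanced accuracy = mean of per-class recalls.
y_true = np.array(["DJI", "DJI", "Noise", "Noise", "Noise", "Taranis"])
y_pred = np.array(["DJI", "Noise", "Noise", "Noise", "Noise", "Taranis"])
print(balanced_accuracy_score(y_true, y_pred))  # (0.5 + 1.0 + 1.0) / 3 ≈ 0.83
```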
…the current approach will still detect a drone if one or more are present.

We have only tested a limited set of VGG architectures. It remains to be seen whether more recent architectures, such as the pre-trained Vision Transformer [32], generalise as well or better. We hope that our development dataset will inspire others to further optimise the model side of the problem and perhaps find a model architecture with better performance.

Another issue to consider is the occurrence of unknown drones, i.e. drones that are not part of the training set. Examining the embedding space (cf. B) gives a first idea of whether a signal is clearly part of a known, dense drone cluster or rather falls into the larger, less dense noise cluster. We believe that a combination of an unsupervised deep autoencoder approach [33], [34] with an additional classification part (cf. [35]) would allow, first, a stable classification of known samples and, second, an indication of whether a sample is known or rather an anomaly.
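A minimal sketch of the kind of combined model we have in mind (an illustration of the idea, not an implemented or validated system): an autoencoder whose reconstruction error flags unknown signals, with a classification head on the latent code for the known classes.

```python
import torch
import torch.nn as nn

class AEClassifier(nn.Module):
    """Autoencoder with a classification head on the latent code.

    High reconstruction error -> sample is likely unknown (anomaly);
    otherwise the head provides the class prediction.
    """
    def __init__(self, n_classes: int = 7, latent: int = 64):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 16, latent),
        )
        self.dec = nn.Sequential(
            nn.Linear(latent, 32 * 16), nn.Unflatten(1, (32, 4, 4)),
            nn.Upsample(size=(64, 64)), nn.Conv2d(32, 1, 3, padding=1),
        )
        self.head = nn.Linear(latent, n_classes)

    def forward(self, x):
        z = self.enc(x)
        return self.dec(z), self.head(z)

model = AEClassifier()
x = torch.randn(8, 1, 64, 64)                    # toy spectrogram batch
recon, logits = model(x)
score = ((recon - x) ** 2).mean(dim=(1, 2, 3))   # anomaly score per sample
is_unknown = score > score.median()              # threshold set on validation data in practice
```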
REFERENCES
[1] N. Al-lQubaydhi, A. Alenezi, T. Alanazi, A. Senyor, N. Alanezi, B. Alotaibi, M. Alotaibi, A. Razaque, and S. Hariri, "Deep learning for unmanned aerial vehicles detection: A review," Computer Science Review, vol. 51, p. 100614, Feb. 2024. [Online]. Available: https://linkinghub.elsevier.com/retrieve/pii/S1574013723000813
[2] M. H. Rahman, M. A. S. Sejan, M. A. Aziz, R. Tabassum, J.-I. Baik, and H.-K. Song, "A comprehensive survey of unmanned aerial vehicles detection and classification using machine learning approach: Challenges, solutions, and future directions," Remote Sensing, vol. 16, p. 879, Mar. 2024. [Online]. Available: https://www.mdpi.com/2072-4292/16/5/879
[3] M. S. Allahham, M. F. Al-Sa'd, A. Al-Ali, A. Mohamed, T. Khattab, and A. Erbad, "DroneRF dataset: A dataset of drones for RF-based detection, classification and identification," Data in Brief, vol. 26, p. 104313, Oct. 2019. [Online]. Available: https://linkinghub.elsevier.com/retrieve/pii/S2352340919306675
[4] M. F. Al-Sa'd, A. Al-Ali, A. Mohamed, T. Khattab, and A. Erbad, "RF-based drone detection and identification using deep learning approaches: An initiative towards a large open source drone database," Future Generation Computer Systems, vol. 100, pp. 86–97, Nov. 2019.
[5] C. J. Swinney and J. C. Woods, "Unmanned aerial vehicle flight mode classification using convolutional neural network and transfer learning," in 2020 16th International Computer Engineering Conference (ICENCO), 2020, pp. 83–87.
[6] Y. Zhang, "RF-based drone detection using machine learning," in 2021 2nd International Conference on Computing and Data Science (CDS), 2021, pp. 425–428.
[7] C. Ge, S. Yang, W. Sun, Y. Luo, and C. Luo, "For RF signal-based UAV states recognition, is pre-processing still important at the era of deep learning?" in 2021 7th International Conference on Computer and Communications (ICCC), 2021, pp. 2292–2296.
[8] Y. Xue, Y. Chang, Y. Zhang, J. Sun, Z. Ji, H. Li, Y. Peng, and J. Zuo, "UAV signal recognition of heterogeneous integrated KNN based on genetic algorithm," Telecommunication Systems, vol. 85, pp. 591–599, Apr. 2024. [Online]. Available: https://link.springer.com/10.1007/s11235-023-01099-x
[9] A. AlKhonaini, T. Sheltami, A. Mahmoud, and M. Imam, "UAV detection using reinforcement learning," Sensors, vol. 24, no. 6, 2024. [Online]. Available: https://www.mdpi.com/1424-8220/24/6/1870
[10] M. Ezuma, F. Erden, C. K. Anjinappa, O. Ozdemir, and I. Guvenc, "Detection and classification of UAVs using RF fingerprints in the presence of Wi-Fi and Bluetooth interference," IEEE Open Journal of the Communications Society, vol. 1, pp. 60–76, 2020. [Online]. Available: https://ieeexplore.ieee.org/document/8913640/
[11] ——, "Drone remote controller RF signal dataset," 2020. [Online]. Available: https://dx.doi.org/10.21227/ss99-8d56
[12] E. Ozturk, F. Erden, and I. Guvenc, "RF-based low-SNR classification of UAVs using convolutional neural networks," ITU Journal on Future and Evolving Technologies, vol. 2, pp. 39–52, Jul. 2021. [Online]. Available: https://www.itu.int/pub/S-JNL-VOL2.ISSUE5-2021-A04
[13] C. J. Swinney and J. C. Woods, "DroneDetect dataset: A radio frequency dataset of unmanned aerial system (UAS) signals for machine learning detection & classification," 2021. [Online]. Available: https://dx.doi.org/10.21227/5jjj-1m32
[14] ——, "RF detection and classification of unmanned aerial vehicles in environments with wireless interference," in 2021 International Conference on Unmanned Aircraft Systems (ICUAS), 2021, pp. 1494–1498.
[15] S. Kunze and B. Saha, "Drone classification with a convolutional neural network applied to raw IQ data," in 2022 3rd URSI Atlantic and Asia Pacific Radio Science Meeting (AT-AP-RASC), May 2022, pp. 1–4. [Online]. Available: https://ieeexplore.ieee.org/document/9814170/
[16] O. Medaiyese, M. Ezuma, A. Lauf, and A. Adeniran, "Cardinal RF (CardRF): An outdoor UAV/UAS/drone RF signals with Bluetooth and WiFi signals dataset," 2022. [Online]. Available: https://dx.doi.org/10.21227/1xp7-ge95
[17] O. O. Medaiyese, M. Ezuma, A. P. Lauf, and A. A. Adeniran, "Hierarchical learning framework for UAV detection and identification," IEEE Journal of Radio Frequency Identification, vol. 6, pp. 176–188, 2022.
[18] O. O. Medaiyese, M. Ezuma, A. P. Lauf, and I. Guvenc, "Wavelet transform analytics for RF-based UAV detection and identification system using machine learning," Pervasive and Mobile Computing, vol. 82, p. 101569, Jun. 2022. [Online]. Available: https://linkinghub.elsevier.com/retrieve/pii/S1574119222000219
[19] F. N. Iandola, M. W. Moskewicz, K. Ashraf, S. Han, W. J. Dally, and K. Keutzer, "SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <1MB model size," CoRR, vol. abs/1602.07360, 2016. [Online]. Available: http://arxiv.org/abs/1602.07360
[20] S. Glüge, M. Nyfeler, N. Ramagnano, C. Horn, and C. Schüpbach, "Robust drone detection and classification from radio frequency signals using convolutional neural networks," in Proceedings of the 15th International Joint Conference on Computational Intelligence - NCTA, INSTICC. SciTePress, 2023, pp. 496–504.
[21] R. Zhao, T. Li, Y. Li, Y. Ruan, and R. Zhang, "Anchor-free multi-UAV detection and classification using spectrogram," IEEE Internet of Things Journal, vol. 11, pp. 5259–5272, Feb. 2024. [Online]. Available: https://ieeexplore.ieee.org/document/10221859/
[22] C. Sun, A. Shrivastava, S. Singh, and A. Gupta, "Revisiting unreasonable effectiveness of data in deep learning era," in 2017 IEEE International Conference on Computer Vision (ICCV), 2017, pp. 843–852.
[23] P. Virtanen, R. Gommers, T. E. Oliphant, M. Haberland, T. Reddy, D. Cournapeau, E. Burovski, P. Peterson, W. Weckesser, J. Bright, S. J. van der Walt, M. Brett, J. Wilson, K. J. Millman, N. Mayorov, A. R. J. Nelson, E. Jones, R. Kern, E. Larson, C. J. Carey, İ. Polat, Y. Feng, E. W. Moore, J. VanderPlas, D. Laxalde, J. Perktold, R. Cimrman, I. Henriksen, E. A. Quintero, C. R. Harris, A. M. Archibald, A. H. Ribeiro, F. Pedregosa, P. van Mulbregt, and SciPy 1.0 Contributors, "SciPy 1.0: Fundamental algorithms for scientific computing in Python," Nature Methods, vol. 17, pp. 261–272, 2020.
[24] K. Simonyan and A. Zisserman, "Very deep convolutional networks for large-scale image recognition," in 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, May 7-9, 2015, Conference Track Proceedings, Y. Bengio and Y. LeCun, Eds., 2015. [Online]. Available: http://arxiv.org/abs/1409.1556
[25] S. Ioffe and C. Szegedy, "Batch normalization: Accelerating deep network training by reducing internal covariate shift," in Proceedings of the 32nd International Conference on International Conference on Machine Learning - Volume 37, ser. ICML'15. JMLR.org, 2015, pp. 448–456.
[26] A. Paszke, S. Gross, F. Massa, A. Lerer, J. Bradbury, G. Chanan, T. Killeen, Z. Lin, N. Gimelshein, L. Antiga, A. Desmaison, A. Kopf, E. Yang, Z. DeVito, M. Raison, A. Tejani, S. Chilamkurthy, B. Steiner, L. Fang, J. Bai, and S. Chintala, "PyTorch: An imperative style, high-performance deep learning library," in Advances in Neural Information Processing Systems 32, H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, and R. Garnett, Eds. Curran Associates, Inc., 2019, pp. 8024–8035.