Detection and Classification of UAVs Using RF Fingerprints in the Presence of Wi-Fi and Bluetooth Interference
Index Terms—Interference, machine learning, Markov models, RF fingerprinting, unmanned aerial vehicles (UAVs), UAV detection and classification.
This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.
This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/OJCOMS.2019.2955889, IEEE Open
Journal of the Communications Society
interference source. Given that the detected RF signal is from an interference source, the source class is identified as Wi-Fi or Bluetooth. On the other hand, if the detected signal is from a UAV controller, the signal is transferred to the ML-based classification system to determine the make and model of the UAV controller. In an earlier work [1], the authors proposed a system for detecting and classifying 14 different UAV controllers. The system design assumes the absence of interference signals. However, this assumption is not always correct. The contributions of the current work are summarized as follows:

1) The paper investigates the problem of detecting and classifying signals from UAV controllers in the presence of co-channel wireless interference. We consider interference from Wi-Fi and Bluetooth sources and describe a methodology to detect the UAVs. The interference detection ensures the proposed UAV detection system is robust against false alarms and missed target detections. In addition, in [1], we used two fixed thresholds, positioned at ±3σ, to transform the captured signal into three-state Markov models, where σ is the standard deviation of the noise signal in the environment. However, in the current work, we use a single threshold to transform the captured signal into two-state Markov models, which reduces the overall complexity. We also define a procedure to determine the optimum threshold value based on the available training data. It turns out that better detection accuracy can be achieved when a single but properly selected threshold is used to generate the Markov models. At an SNR of 10 dB, the current work achieves a detection accuracy of 99.8% using a threshold that is 3.5 times the standard deviation, whereas in [1] the detection accuracy is 84% under the same SNR condition. In addition, in the current study, we evaluate the detection performance for different thresholds based on the false alarm rate (FAR).

2) We introduce the concept of the energy transient for the extraction of RF-based features and show how effective it is for the classification of UAV controller signals. The energy transient is computed using the representation of the RF signals in the energy-time-frequency domain. From the energy transient, 15 statistical features are extracted for the UAV classification. The performance of five different ML algorithms is compared using the proposed RF fingerprinting technique. In addition, we investigate neighborhood component analysis (NCA) as a practical algorithm for feature selection in the classification problem. The classification results using the three most significant features, selected by the NCA, are compared with those obtained when all 15 RF features are used. We also evaluate the classification performance at different signal-to-noise ratios (SNRs). For an SNR of 25 dB, the results show that the k-nearest neighbor (kNN) and random forest (RandF) machine learning algorithms are the best performing classifiers, achieving accuracies of 98.13% and 97.73%, respectively, when the three most significant RF-based features are used for the classification of 15 UAV controllers. In comparison, the kNN classifier achieves an accuracy of 96.3% when used to classify 14 UAV controllers [1]. Furthermore, in the current work, for the case of 15 UAV controllers, the discriminant analysis (DA) and neural network (NN) classifiers achieve an average accuracy of 94.43% and 96.13%, respectively, whereas in [1] DA and NN achieve an average accuracy of 88.15% and 58.49%, respectively, when used to classify 14 UAV controllers.

3) We study the confusion that results when attempting to classify UAV controllers of the same make and model. This is important in digital forensic analysis and in detecting decoys in surveillance systems. To investigate this confusion, we included two pairs of identical UAV controllers in a pool of 13 different UAV controllers. That is, we capture control signals from 17 UAV controllers and evaluate the ability of the proposed classification system at different SNRs. For an SNR of 25 dB, kNN and RandF achieve accuracies of 95.53% and 95.18%, respectively, when the three most significant RF features are used. To the best of our knowledge, past studies on UAV classification using RF techniques consider only a limited number of different make and model UAV controllers, often fewer than 10 [8], [9].

The remainder of the paper is organized as follows: Section II provides a brief overview of the related work. Section III describes the multistage detection system, and Section IV introduces the methodology to detect Wi-Fi and Bluetooth interference signals. Feature extraction and the RF fingerprinting-based UAV classification system are explained in Section V. The experimental setup and data capture technique are described in Section VI, while the detection and classification results are presented in Section VII. The paper is concluded in Section VIII.

II. RELATED WORK

UAV detection and classification through RF signals can be grouped into two major headings: RF physical layer features-based and RF medium access control (MAC) layer features-based techniques. In general, these techniques use an RF sensing device to capture the RF communication signal between a UAV and its controller.

A. RF Physical Layer Features-Based Techniques

Most of the techniques classified within this category rely on the physical layer characteristics of the RF transmission from a UAV to its controller (or vice versa), such as the amplitude envelope or the spectrum of the RF signal. These techniques are sometimes referred to as RF fingerprinting techniques because they utilize the unique characteristics of the RF signals for the detection and classification of the UAVs. Experimental investigations show that most commercial UAVs have unique RF signatures, which are due to the circuitry designs and modulation techniques employed. Therefore, RF fingerprints extracted from the UAV or its remote controller signals can be used as a basis for the detection and classification of the UAVs.

In [10], RF fingerprints of the UAV's wireless control signals are extracted by computing the amplitude envelope of the signal. The dimensionality of the processed signal is reduced by performing principal component analysis (PCA), and the lower-dimensional data is fed into an auxiliary classifier Wasserstein generative adversarial network (AC-WGAN). The AC-WGAN achieves an overall classification rate of 95% when four different types of UAVs are considered.
Fig. 1. The scenario of the RF-based UAV detection system. The passive RF surveillance system listens for the signal transmitted between the controller and the UAV. The environment contains signals from Wi-Fi and Bluetooth interference devices which operate in the same frequency band as the UAV and its remote controller.
In [11], drones are detected by analyzing the RF background activities along with the RF signals emitted when the drones are operated in different modes. Afterward, the RF spectrum of the drone signal is computed using the discrete Fourier transform (DFT). The drone classification system is designed by training a deep neural network with the RF spectrum data of different drones. The system shows an accuracy of 99.7% when two drones are classified, 84.5% with four drones, and 46.8% with ten drones.

In [12], an industry integrated counter-drone solution is described. The solution is based on a network of distributed RF sensors. In this system, RF signals from different UAV controllers are detected using an energy detector. Afterward, the signals of interest are classified using RF spectral shape correlation features. Besides, the distributed RF sensors make it possible to localize the UAV controller using time difference of arrival (TDoA) or multilateration techniques. However, this industrial solution is quite expensive.

B. RF MAC Layer Features-Based Techniques

Many UAVs use the Wi-Fi protocol for video streaming and control. The techniques categorized under this heading use MAC layer features, such as packet statistics, for the detection and classification of Wi-Fi controlled UAVs. These techniques are sometimes referred to as Wi-Fi fingerprinting techniques. Thus, the RF detection system consists primarily of a Wi-Fi packet-sniffing device, which can intercept the Wi-Fi data traffic between a UAV and its remote controller. In [9], unauthorized Wi-Fi controlled UAVs are detected by a patrolling drone using a set of Wi-Fi statistical features. The extracted features include MAC addresses, the root-mean-square (RMS) of the Wi-Fi packet length, packet duration, and average packet inter-arrival time, among others. These features are used to train different ML algorithms which perform the UAV classification task. In [9], the random tree and random forest classifiers achieve the best performance as measured by the true positive and false positive rates.

In [13], drone presence is detected by eavesdropping on the Wi-Fi channels between the drone and its controller. The system detects drones by analyzing the impact of their unique vibration and body-shifting motions on the Wi-Fi signals transmitted by the drone. The system achieves accuracy above 90% at 50 meters.

In general, a major concern with the Wi-Fi fingerprinting techniques is privacy. This is because the same Wi-Fi detection system can spoof Wi-Fi traffic data from a smartphone user or a private Wi-Fi network. In addition, only a limited number of commercial drones employ Wi-Fi links for video streaming and control; most commercial drones use proprietary communication links.

Besides RF and Wi-Fi fingerprinting techniques, several other techniques have been investigated for UAV detection, including radar-based, acoustic, and computer vision techniques [14]. However, as discussed in [14], traditional radar systems are not very effective in detecting UAVs with small radar cross sections, and acoustic and computer vision-based techniques are greatly impaired by ambient environmental conditions. In contrast, RF techniques are not limited by these problems. We start by describing the design of the multistage detector of our proposed RF-based system.

III. MULTISTAGE UAV SIGNAL DETECTION

We consider the scenario shown in Fig. 1, where a passive RF surveillance system listens for the control signals transmitted between a UAV and its remote controller. The main hardware components of the surveillance system are a 2.4 GHz RF antenna and a high-frequency oscilloscope capable of sampling the captured data at 20 GSa/s. Instead of an oscilloscope, a standard software-defined radio such as the universal software radio peripheral (USRP) can also be used for data capture. In order to avoid aliasing, the data capture device should be able to sample the captured data above the Nyquist rate. In this study, since we are interested in capturing RF data in the 2.4 GHz band, the data capture device should be able to sample at a rate of at least 5 GSa/s. Besides, if the RF surveillance system is passive, as depicted in Fig. 1, the stealth attribute of the detection system increases. This implies that the system shown in Fig. 1 can detect an adversary UAV while itself remaining undetected by the UAV. This stealth attribute is vital in electronic warfare environments, where low probability of intercept (LPI) emitters are very valuable. Furthermore, the passive RF detection system has an advantage over a radar system in terms of the maximum detection range. This is because, while a radar has to transmit pulses and listen for the backscattering (echo) from the target, the passive RF detector only needs to listen for the signals from the target.
Fig. 2. RF signals captured from eight different UAV controllers and four different UAVs while in flight: (a) Graupner MC-32, (b) Spektrum DX6e, (c) Futaba T8FG, (d) DJI Phantom 4 Pro, (e) DJI Inspire 1 Pro, (f) JR X9303, (g) Jeti Duplex DC-16, (h) FlySky FS-T6, (i) DJI Matrice 600 UAV, (j) DJI Phantom 4 Pro UAV, (k) DJI Inspire 1 Pro UAV, (l) DJI Mavic Pro.
Since most commercial UAVs operate in the 2.4 GHz band, the passive RF surveillance system is designed to operate in this frequency band. However, this band also corresponds to the operational band of Wi-Fi and mobile Bluetooth devices. Therefore, in a real wireless environment, signals from these wireless sources will act as interference to the detection of the UAV control signals. Also, in such a real environment, the presence of noise may further reduce the chance of correctly detecting the UAV signals when present.

Given the scenario in Fig. 1, the passive RF surveillance system has to decide whether the captured data comes from a UAV controller, an interference source, or background noise. In the case where the captured data comes from a UAV controller, the detection system should be able to correctly classify the UAV controller. However, if the detected signal is from an interference source, the detection system should be able to correctly identify the source, i.e., a Wi-Fi or a Bluetooth device. Therefore, the detection problem is a multi-hypothesis problem. For such problems, it is well known that the computational complexity increases as the number of hypotheses increases. Consequently, the multi-hypothesis detection problem can be simplified by using a multistage sequential detector, in which each detection stage is a simple binary hypothesis test that is much easier to solve.

Fig. 2 illustrates sample RF signals captured from eight different UAV controllers and four different UAVs (in flight). The figure shows that each signal has different characteristics, which can be exploited for identifying the source UAV controller. The flowchart in Fig. 3 provides a high-level graphical description of the entire system. The first step in detecting and identifying a UAV controller is data capture. Usually, the captured raw signal is large and often very noisy. Therefore, before detection and classification, the signals are first pre-processed using multiresolution analysis. Next, the processed signals are transferred to the multistage detection system, which consists of two stages. In the first stage, the detector employs a naïve Bayesian hypothesis test to decide whether the captured signal is an RF signal or noise. If the decision is positive, the second-stage detector is activated to decide whether the captured RF signal comes from an interference source or a UAV controller. This detector uses bandwidth analysis and modulation-based features for interference detection. If the detected RF signal is not from a Wi-Fi or Bluetooth interference source, it is presumed to be a signal transmitted by a UAV controller. Consequently, the detected signal is transferred to an ML-based classification system for accurate identification of the UAV controller.

A. Pre-processing Step: Multiresolution Analysis

Captured RF data are pre-processed by means of wavelet-based multiresolution analysis. It has been established that multiresolution decomposition using a discrete wavelet transform (DWT) such as the Haar wavelet transform is effective for analyzing the information content of signals and images [15].
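As a concrete illustration, the two-level Haar decomposition used for pre-processing can be sketched in a few lines of pure Python. This is our own minimal sketch, not the paper's implementation; the function names are ours, and the output keeps only the second-level detail coefficients, as described in this section.

```python
import math

SQRT2 = math.sqrt(2.0)

def haar_step(x):
    """One Haar DWT level with dyadic decimation: the half-band low-pass
    h[n] and high-pass g[n] filters reduce to pairwise sums/differences."""
    approx = [(x[i] + x[i + 1]) / SQRT2 for i in range(0, len(x) - 1, 2)]  # a[n]
    detail = [(x[i] - x[i + 1]) / SQRT2 for i in range(0, len(x) - 1, 2)]  # d[n]
    return approx, detail

def two_level_haar(y):
    """Two-level decomposition: a1[n] is decomposed again and the
    second-level detail coefficients d2[n] are taken as yT[n]."""
    a1, _d1 = haar_step(y)
    _a2, d2 = haar_step(a1)
    return d2  # four times fewer samples than y[n]
```

Each level halves the sample count, so the two-level output has a quarter of the input samples, which is what reduces the complexity of the downstream detector.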
Fig. 5. (a) The sampled raw data y[n] captured from the remote controller of a DJI Phantom 4 Pro UAV using an oscilloscope with a sampling rate of 20 GSa/s, and (b) the transformed data yT[n] obtained at the output of the two-level Haar wavelet filter. Due to successive downsampling, yT[n] has about 3.8 × 10^6 fewer data samples than y[n].

Fig. 3. The system flowchart providing a graphical description of information processing and the flow of data through the system.

Fig. 4. The two-level discrete Haar wavelet transform for pre-processing of the captured raw data.

In this work, multiresolution decomposition of the captured RF data is carried out using the two-level Haar transform shown in Fig. 4. Using this transform, the raw input signal is decomposed into subbands, and important time-frequency information can be extracted at different resolution levels [16]. In the first level, the input RF data are split into low- and high-frequency components by means of the half-band low-pass (h[n]) and high-pass (g[n]) filters, respectively. This process is followed by a dyadic decimation, or downsampling, of the filter outputs to produce the approximate coefficients, a1[n], and detail coefficients, d1[n]. In the second level, the a1[n] coefficients are further decomposed in a similar manner, and the generated d2[n] coefficients are taken as the final output, yT[n], which is input to the multistage detection system. Moving from left to right in Fig. 4, we obtain coarser representations of the captured RF data. The output RF data have fewer samples due to the successive downsampling of the input RF data, which reduces the computational complexity of the overall process. Multiresolution analysis is also useful in detecting weak signals in the presence of background noise and removing

B. Naïve Bayes Decision Mechanism for RF Signal Detection

In this stage, we first model the pre-processed RF data, yT[n], using two-state Markov models for the "RF signal" and "noise" classes. This allows us to compute the likelihood that the captured data come from either the signal or the noise class. According to Bayesian decision theory, the optimum detector is the one that maximizes the posterior probability. Mathematically, let C ∈ {0, 1} be an index denoting the class of the pre-processed RF data yT[n], where C = 1 when the captured raw signal y[n] is an RF signal, and C = 0 otherwise. Let SyT = [SyT(1), SyT(2), . . . , SyT(N)]^T be the state vector representation of the given test data yT[n] containing N samples, with SyT(i) ∈ {S1, S2}, i = 1, 2, . . . , N, and S1 and S2 being the two states of the Markov models. Then, the posterior probability of the RF signal class given SyT is

    P(C = 1 | SyT) = P(SyT | C = 1) P(C = 1) / P(SyT),    (1)

where P(SyT | C = 1) is the likelihood function conditioned on C = 1, P(C = 1) is the prior probability of the RF signal class, and P(SyT) is the evidence. A similar expression holds for the posterior probability P(C = 0 | SyT). In practice, since the evidence is not a function of C, it can be ignored. Therefore, we are only interested in maximizing the numerator in (1). That is,

    Ĉ = arg max_C P(SyT | C) P(C).    (2)

We decide that the captured signal is an RF signal (i.e., C = 1) if

    P(SyT | C = 1) P(C = 1) ≥ P(SyT | C = 0) P(C = 0).    (3)
Fig. 6. Two-state Markov models and associated state transition probabilities using δ = 3.5σ for (a) the RF signal class (p11 = 0.0338, p12 = p21 = 0.0245, p22 = 0.9171) and (b) the noise class (p11 = 0.544, p12 = p21 = 0.1870, p22 = 0.0816).

For the detection experiment, we collected an equal number of RF and noise signals. Therefore, it is rational to assume that the prior probabilities of the RF signal and noise classes are equal, and the decision rule in (3) reduces to

    P(SyT | C = 1) ≥ P(SyT | C = 0).    (4)

Therefore, for a given test signal, we need to compute and compare the likelihood probabilities P(SyT | C = {0, 1}). First, in order to compute the likelihood probability for the RF signal and noise classes, we use a large amount of training data captured from multiple UAV controllers, Wi-Fi routers, mobile Bluetooth emitters, and background noise. This training data set is stored in a database, as shown in Fig. 3. Since the captured RF data (after sampling) is a discrete time-varying waveform, we can model it as a stochastic sequence of states/events. The likelihood probability of such a state sequence can be computed from the transitions between the states of the generated Markov models.

A two-state Markov model for a given signal yT[n] can be generated by mapping each sample of the signal to one of the two states (S1 and S2). The samples whose absolute amplitudes are less than or equal to a predetermined threshold δ are assigned to state S1, while the samples with absolute amplitude greater than δ are assigned to state S2. Mathematically, the state transformation is performed as follows:

    SyT(n) = { S1, |yT[n]| ≤ δ
             { S2, |yT[n]| > δ.    (5)

Based on the above rule, it is straightforward to transform yT[n] into the state vector SyT. Once SyT is obtained, the probability of a transition between any two states is calculated. Note that the state vector is generated based on the amplitude of the signal samples in the wavelet domain. The choice of δ in (5) depends on the operating SNR of the system and will be discussed in Section VII-A. The transition count matrix, TN, and the transition probability matrix, TP, are defined as

    TN = [ N11  N12 ]     TP = [ p11  p12 ] = TN / Σ_{i,j} Nij,    (6)
         [ N21  N22 ],         [ p21  p22 ]

respectively, where Nij is the number of transitions from state Si to Sj among all samples of yT[n], and pij = P(Si → Sj) is the probability of a transition from state Si to Sj. The matrix TP is obtained by normalizing the TN matrix with the total number of samples in the signal. It is expected that the transition probabilities generated for the signal class (UAV, Wi-Fi, and Bluetooth) and the noise class will be significantly different at modest SNR levels. Also, the choice of δ in (5) dictates the transition probabilities for both the signal and noise classes. In Section VII-A, the threshold δ is expressed in terms of the estimated standard deviation (σ) of the preprocessed noise data captured from the environment. Moreover, during the experiments, data is captured within a short time window (0.25 ms); thus, we assume the environmental noise is stationary during this interval.

Fig. 6 shows the two-state Markov models for the RF signal and noise classes obtained from the training data using δ = 3.5σ. From Fig. 6(a), we see that for the signal class, p22 is significantly higher than p11, p12, and p21. On the other hand, from Fig. 6(b), we see that for the noise class, p11 is significantly higher than the other transition probabilities. Based on these observations, the differences between the state transition probabilities of each class can be utilized to determine the class of a captured test signal.

Consequently, the likelihood of the test signal being an RF signal can be calculated as follows:

    P(SyT | C = 1) = ∏_{n=1}^{N−1} p(SyT(n) → SyT(n + 1) | C = 1)
                   = ∏_{i,j ∈ {1,2}} TP_{C=1}(i, j)^{TN(i,j)}
                   = ∏_{i,j ∈ {1,2}} p_{ij;C=1}^{Nij}.    (7)

The product of the conditional transition probabilities in the above equation gives the likelihood of obtaining the state vector SyT given that the hypothesis C = 1 is true. The log-likelihood of the above expression is

    log(P(SyT | C = 1)) = Σ_{i,j ∈ {1,2}} Nij log(p_{ij;C=1}).    (8)

Similarly, the log-likelihood of the signal coming from the noise class is calculated as

    log(P(SyT | C = 0)) = Σ_{i,j ∈ {1,2}} Nij log(p_{ij;C=0}).    (9)

The decision is made in favor of C = 1 if log(P(SyT | C = 1)) ≥ log(P(SyT | C = 0)); otherwise, C = 0. We discuss the detection results in Section VII-A. If the captured test signal belongs to the RF signal class, then the second-stage detector is invoked to identify UAV controller-type signals. Otherwise, the system continues sensing the environment for the presence of signals, as shown in Fig. 3.

IV. DETECTION OF WI-FI AND BLUETOOTH INTERFERENCE

In recent times, there has been interest in detecting Wi-Fi and Bluetooth signals [17]. In [18], a new technique is proposed for classifying Wi-Fi and Bluetooth interference signals in the 2.4 GHz band. The technique uses the Hidden Markov Model (HMM) to model sequences or periodicity in the captured signal. The expectation-maximization (EM)
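The state mapping in (5), the transition counts in (6), and the log-likelihood test in (8)–(9) can be sketched as follows. This is a minimal pure-Python illustration (function names are ours); the example transition matrices are the trained values shown in Fig. 6 for δ = 3.5σ, and, as in (6), they are normalized by the total transition count rather than per row, matching how (7) uses them.

```python
import math

def to_states(y_t, delta):
    """Eq. (5): state S1 (index 0) if |yT[n]| <= delta, else S2 (index 1)."""
    return [0 if abs(v) <= delta else 1 for v in y_t]

def transition_counts(states):
    """Eq. (6): 2x2 count matrix, counts[i][j] = number of Si -> Sj transitions."""
    counts = [[0, 0], [0, 0]]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    return counts

def log_likelihood(counts, tp):
    """Eqs. (8)-(9): sum of N_ij * log(p_ij) over the trained matrix tp."""
    return sum(counts[i][j] * math.log(tp[i][j])
               for i in range(2) for j in range(2) if counts[i][j] > 0)

def is_rf_signal(y_t, delta, tp_signal, tp_noise):
    """Decision rule below (9): C = 1 when the signal-class likelihood wins."""
    counts = transition_counts(to_states(y_t, delta))
    return log_likelihood(counts, tp_signal) >= log_likelihood(counts, tp_noise)

# Trained transition probabilities taken from Fig. 6 (delta = 3.5*sigma)
TP_SIGNAL = [[0.0338, 0.0245], [0.0245, 0.9171]]
TP_NOISE = [[0.5440, 0.1870], [0.1870, 0.0816]]
```

A burst whose samples sit mostly above δ accumulates S2→S2 transitions and is scored as an RF signal, while a low-amplitude capture accumulates S1→S1 transitions and is scored as noise.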
Fig. 7. Bandwidth analysis of (a) Wi-Fi signal, (b) Bluetooth signal from Motorola e5 cruise, and (c) Spektrum DX5e UAV controller signal.
Fig. 8. Extraction of the modulation features of a Bluetooth interference signal from Motorola e5 cruise mobile device using zero-crossing demodulation
technique: (a) Raw signal, (b) FFT of the raw signal, (c) FFT of the shifted and resampled signal (by 1/2000), (d) the demodulated signal showing a
peak-to-peak frequency of 551.12 kHz, (e) binary signal, and (f) histogram of the time-interval between consecutive zero-crossings in the modulated signal.
TABLE II
STATISTICAL FEATURES.

Features                   Formula                                        Measures
Mean (µ)                   (1/N) Σ_{i=1..N} x_i                           Central tendency
Absolute mean (x̄)          (1/N) Σ_{i=1..N} |x_i|                         Central tendency
Standard deviation (σ_T)   [ (1/(N−1)) Σ_{i=1..N} (x_i − x̄)² ]^(1/2)      Dispersion
Skewness (γ)               Σ_{i=1..N} (x_i − x̄)³ / ((N−1) σ_T³)           Asymmetry/shape descriptor
Entropy (H)                −Σ_{i=1..N} x_i log2 x_i                       Uncertainty
Root mean square (x_rms)   [ (1/N) Σ_{i=1..N} x_i² ]^(1/2)                Magnitude/average power
Root (x_r)                 [ (1/N) Σ_{i=1..N} |x_i|^(1/2) ]²              Magnitude
Kurtosis (k)               Σ_{i=1..N} (x_i − x̄)⁴ / ((N−1) σ_T⁴)           Tail/shape descriptor
Variance                   (1/N) Σ_{i=1..N} (x_i − µ)²                    Dispersion
Peak value (x_pv)          max(x_i)                                       Amplitude
Peak to peak (x_ppv)       max(x_i) − min(x_i)                            Waveform amplitude

[Figure residue on this page: a power spectral density plot, Power/frequency (dB/Hz) vs. Frequency (GHz), with activity around 2.4 GHz, panel (a), and an axis labeled "Normalized energy trajectory"; the captions were not recovered.]
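For illustration, several of the Table II statistics can be computed from an energy-transient vector in plain Python. This is our own sketch: the formulas follow the table (note that the table's skewness, kurtosis, and standard deviation use the absolute mean x̄), the entropy row assumes a normalized, strictly positive trajectory, and the shape factor, whose table row is not recovered here but which is later selected by the NCA, is assumed to be the usual rms/x̄ ratio.

```python
import math

def rf_features(x):
    """A subset of the Table II statistics for an energy-transient vector x."""
    n = len(x)
    mean = sum(x) / n
    abs_mean = sum(abs(v) for v in x) / n                      # x_bar
    std = math.sqrt(sum((v - abs_mean) ** 2 for v in x) / (n - 1))
    rms = math.sqrt(sum(v * v for v in x) / n)
    return {
        "mean": mean,
        "absolute_mean": abs_mean,
        "standard_deviation": std,
        "skewness": sum((v - abs_mean) ** 3 for v in x) / ((n - 1) * std ** 3),
        "kurtosis": sum((v - abs_mean) ** 4 for v in x) / ((n - 1) * std ** 4),
        "variance": sum((v - mean) ** 2 for v in x) / n,
        "rms": rms,
        "root": (sum(math.sqrt(abs(v)) for v in x) / n) ** 2,
        "entropy": -sum(v * math.log2(v) for v in x),  # assumes all v > 0
        "peak_to_peak": max(x) - min(x),
        "shape_factor": rms / abs_mean,  # assumed definition; row not recovered
    }
```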
Make        Model             Make        Model
DJI         Inspire 1 Pro     Spektrum    DX5e
            Matrice 100                   DX6e
            Matrice 600¹                  DX6i
            Phantom 4 Pro¹    JR          X9303
            Phantom 3         Graupner    MC-32
Futaba      T8FG              FlySky      FS-T6
HobbyKing   HK-T6A            Jeti        Duplex DC-16
Turnigy     9X

¹ Two identical units of this controller model are included in the 17-controller experiments.

[Figure residue: a bar chart of feature weights (vertical axis 0 to 0.1) over the 15 statistical features; only fragments of the rotated feature-name labels survive.]
[Figure: Detection Accuracy (%) on the y-axis (spanning roughly 70 to 100), with curves labeled [0.1, 0.5] and [3.1, 4.1].]
TABLE IV
Performance of the ML classification algorithms at 25 dB SNR. 100 sample signals from each UAV controller are captured, with 80% used for training and 20% for testing (partition ratio = 0.2). The selected RF fingerprints are: shape factor, kurtosis, and variance.

# of controllers | Classifier | Accuracy (%)²: All feat. / Selected feat. | Computational time (s)²: All feat. / Selected feat.
15 | kNN   | 97.30 / 98.13 |  24.85 /  24.57
15 | DA    | 96.30 / 94.43 |  19.42 /  18.58
15 | SVM   | 96.47 / 91.67 | 119.22 / 111.02
15 | NN    | 96.73 / 96.13 |  38.73 /  38.14
15 | RandF | 98.53 / 97.73 |  21.37 /  20.89
17 | kNN   | 95.62 / 95.53 |  26.16 /  25.13
17 | DA    | 92.77 / 88.12 |  19.36 /  18.90
17 | SVM   | 93.82 / 87.88 | 139.94 / 141.68
17 | NN    | 92.88 / 93.03 |  46.04 /  43.33
17 | RandF | 96.32 / 95.18 |  24.71 /  24.84
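As a rough illustration of the evaluation protocol behind Table IV, the sketch below (assuming scikit-learn, with synthetic three-feature data standing in for the selected fingerprints) trains a kNN classifier with the Mahalanobis distance on an 80/20 split. The class count, feature values, and spread are invented for illustration, not taken from the paper's captured signals.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
n_classes, n_per_class = 4, 100          # 100 sample signals per class
# synthetic stand-ins for the 3 selected fingerprints per signal
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(n_per_class, 3))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

# partition ratio = 0.2 (80% train / 20% test), as in Table IV
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# Mahalanobis metric, matching the reported kNN hyper-parameters
VI = np.linalg.inv(np.cov(X_tr, rowvar=False))
knn = KNeighborsClassifier(n_neighbors=10, algorithm="brute",
                           metric="mahalanobis", metric_params={"VI": VI})
knn.fit(X_tr, y_tr)
acc = knn.score(X_te, y_te)
print(f"test accuracy: {acc:.3f}")
```

The stratified split mirrors the per-controller partitioning: every class contributes the same 80/20 proportion of its signals to training and testing.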
ranking shown in Fig. 11. These are the shape factor, kurtosis, and variance. The classification experiments are run separately for the case of 15 and 17 UAV controllers. Here, the number of controllers represents the number of classes considered. In the case of 15 controllers, all the controllers are of a different model. However, in the case of 17 controllers, a pair of DJI Matrice 600 controllers (labeled as DJI Matrice 600 Mpact and DJI Matrice 600 Ngat) and a pair of DJI Phantom 4 Pro controllers (labeled as DJI Phantom 4 Pro Mpact and DJI Phantom 4 Pro Ngat) are considered in addition to 13 different models.

We used the Bayesian optimization method to obtain the best hyper-parameters for our machine learning models. Bayesian optimization has become a successful tool for hyper-parameter optimization of machine learning algorithms, such as support vector machines or deep neural networks. The algorithm internally maintains a Gaussian process model of the objective function and uses objective function evaluations to train this model. More details can be found in [25], [26]. Some of the critical hyper-parameters for the machine learning models (for the 17-controller case) after the Bayesian hyper-parameter optimization are listed below:
• kNN: Number of neighbors = 10, Distance metric = mahalanobis
• DA: Type = Linear, Delta = 0.15146, Gamma = 0.00016419
• SVM: Coding = onevsone, Lambda = 3.9941e-08, Learner = Logistic
• NN: Double layer: Number of hidden nodes (Layer 1) = 45, Number of hidden nodes (Layer 2) = 15, Learning rate = 0.30103, Activation functions = radbas
• RandF: Bagged ensemble with 60 bagged decision trees

² Both the accuracy and the total computation time are averaged over 10 Monte Carlo simulations.
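The surrogate-based loop just described (fit a Gaussian-process model of the objective, then pick the next evaluation by an acquisition rule) can be sketched on a toy problem. This is a hedged illustration of the standard expected-improvement recipe from [25], not the authors' implementation; scikit-learn and SciPy are assumed, and the quadratic objective stands in for a cross-validation accuracy surface over one hyper-parameter.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # toy stand-in for CV accuracy vs. one hyper-parameter; optimum at x = 2
    return -(x - 2.0) ** 2 + 4.0

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 4.0, size=(3, 1))       # a few initial evaluations
y = objective(X).ravel()
grid = np.linspace(0.0, 4.0, 200).reshape(-1, 1)

for _ in range(10):
    # surrogate: Gaussian-process model trained on evaluations so far
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5),
                                  alpha=1e-6, normalize_y=True)
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    # acquisition: expected improvement over the best value seen so far
    best = y.max()
    z = (mu - best) / np.maximum(sigma, 1e-12)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)
    ei[sigma <= 0] = 0.0
    x_next = grid[np.argmax(ei)].reshape(1, 1)   # next trial point
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

x_best = float(X[np.argmax(y), 0])
print(f"best point found: {x_best:.2f}")
```

The same loop generalizes to several hyper-parameters at once by replacing the 1-D grid with a multi-dimensional candidate set or an inner acquisition optimizer.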
[Fig. 16: three confusion-matrix panels over the 17 controller classes, axes "Target class" versus "Output class", colorbar from 0 to 1. Panel titles: "kNN, acc. = 95.53%, time elapsed: 25.13 s"; "RandF, acc. = 95.18%, time elapsed: 24.84 s" (panel (d): RandF with 17 controllers); "DA, acc. = 88.12%, time elapsed: 18.90 s".]

Fig. 16. Confusion matrices of kNN, RandF and DA classifiers using the three selected RF fingerprints (shape factor, kurtosis, and variance). In the confusion matrices, the colorbar is used to specify the degree of confusion in terms of the confusion probability ρ. Moving down the colorbar, the degree of confusion […]
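The confusion probability ρ visualized in Fig. 16 corresponds to a row-normalized confusion matrix: each row divides the prediction counts for one target class by that class's total number of test signals. A small sketch with scikit-learn; the counts below are invented for illustration (only the normalization mirrors the figure), and the class names are three of the paper's controller labels.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

labels = ["DJI Phantom 4 Pro Mpact", "DJI Phantom 4 Pro Ngat",
          "Spektrum DX5e"]
y_true = [labels[0]] * 10 + [labels[1]] * 10 + [labels[2]] * 10
# identical make/model controllers are the ones most likely to be confused
y_pred = ([labels[0]] * 8 + [labels[1]] * 2     # 2 Mpact signals mislabeled
          + [labels[1]] * 7 + [labels[0]] * 3   # 3 Ngat signals mislabeled
          + [labels[2]] * 10)                   # distinct model: no confusion
# normalize="true" divides each row by the true-class count, giving rho
rho = confusion_matrix(y_true, y_pred, labels=labels, normalize="true")
print(np.round(rho, 2))   # each row sums to 1
```

With real data, the off-diagonal mass concentrated between the Mpact/Ngat pair is exactly the same-make, same-model confusion the paper studies.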
[17] S. Rayanchu, A. Patro, and S. Banerjee, "Airshark: detecting non-WiFi RF devices using commodity WiFi hardware," in Proc. ACM Internet Meas. Conf. (ACM IMC), Berlin, Germany, Nov. 2011, pp. 137–154.
[18] Z. Weng, P. Orlik, and K. J. Kim, "Classification of wireless interference on 2.4 GHz spectrum," in Proc. IEEE Wireless Communications and Networking Conference (WCNC), Istanbul, Turkey, 2014, pp. 786–791.
[19] T. Scholand and P. Jung, "Bluetooth receiver with zero-crossing zero-forcing demodulation," IET Electron. Lett., vol. 39, no. 17, pp. 1275–1277, Aug. 2003.
[20] National Instruments, "Introduction to Bluetooth device testing: From theory to transmitter and receiver measurements," http://download.ni.com/evaluation/rf/intro_to_bluetooth_test.pdf, Tech. Rep., Sept. 2016.
[21] Tektronix, "Wi-Fi: Overview of the 802.11 physical layer and transmitter measurements," https://www.tek.com/document/primer/wi-fi-overview-80211-physical-layer-and-transmitter-measurements, Tech. Rep., Oct. 2017.
[22] R. Esteller, G. Vachtsevanos, J. Echauz, and B. Litt, "A comparison of waveform fractal dimension algorithms," IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 48, no. 2, pp. 177–183, Feb. 2001.
[23] J. Goldberger, G. E. Hinton, S. T. Roweis, and R. R. Salakhutdinov, "Neighbourhood components analysis," in Proc. NeurIPS, Vancouver, Canada, Dec. 2004, pp. 513–520.
[24] W. Yang, K. Wang, and W. Zuo, "Neighborhood component feature selection for high-dimensional data," J. Comput. (JCP), vol. 7, no. 1, pp. 161–168, Jan. 2012.
[25] J. Snoek, H. Larochelle, and R. P. Adams, "Practical Bayesian optimization of machine learning algorithms," in Proc. NeurIPS, 2012, pp. 2951–2959.
[26] A. Klein, S. Falkner, S. Bartels, P. Hennig, and F. Hutter, "Fast Bayesian optimization of machine learning hyperparameters on large datasets," arXiv preprint arXiv:1605.07079, 2016.