Article
An Improved Unauthorized Unmanned Aerial Vehicle
Detection Algorithm Using Radiofrequency-Based
Statistical Fingerprint Analysis
Shengying Yang 1 , Huibin Qin 1, *, Xiaolin Liang 2 and Thomas Aaron Gulliver 3
1 Institute of Electron Device & Application, Hangzhou Dianzi University, Hangzhou 310018, China;
[email protected]
2 Science and Technology on Electronic Test & Measurement Laboratory, The 41st Research Institute of CETC,
Qingdao 266555, China; [email protected]
3 Department of Electrical and Computer Engineering, University of Victoria, Victoria, BC V8W 2Y2, Canada;
[email protected]
* Correspondence: [email protected]; Tel.: +86-181-0017-3077
Received: 19 November 2018; Accepted: 8 January 2019; Published: 11 January 2019
Abstract: Unmanned aerial vehicles (UAVs) are now readily available worldwide and users can easily
fly them remotely using smart controllers. This has created the problem of keeping unauthorized
UAVs away from private or sensitive areas where they can be a personal or public threat. This paper
proposes an improved radio frequency (RF)-based method to detect UAVs. The clutter (interference)
is eliminated using a background filtering method. Then singular value decomposition (SVD) and
average filtering are used to reduce the noise and improve the signal to noise ratio (SNR). Spectrum
accumulation (SA) and statistical fingerprint analysis (SFA) are employed to provide two frequency
estimates. These estimates are used to determine if a UAV is present in the detection environment.
The data size is reduced using a region of interest (ROI), which improves both the system efficiency
and the azimuth estimation accuracy. Detection results are obtained using real UAV RF
signals obtained experimentally which show that the proposed method is more effective than other
well-known detection algorithms. The recognition rate with this method is close to 100% within a
distance of 2.4 km and greater than 90% within a distance of 3 km. Further, multiple UAVs can be
detected accurately using the proposed method.
Keywords: spectrum sensing; radio frequency (RF); singular value decomposition (SVD); spectrum
accumulation (SA); statistical fingerprint analysis (SFA)
1. Introduction
The Internet of Things (IoT) has attracted significant attention worldwide due to the evolution of
pervasive smart environments which can connect people and things anytime and anywhere.
The IoT is suitable for many applications such as smart environments [1], remote sensing [2], public
security [3], smart traffic [4], health care [5], intelligent cities [6], emergency services [7], and industrial
control [8]. Unmanned aerial vehicles (UAVs) were first employed by the military [9–12]. Since then,
many civilian applications have been developed such as UAV photogrammetry which can provide
high spatial resolution data due to the low operational altitude [13–15]. The relatively low cost of
UAV platforms has led to their wide popularity in many fields. UAVs have been employed in ecological
observation [16], vegetation monitoring [17], water quality monitoring [18], precision agriculture [19],
surveying [20], natural resource assessment [21], avalanche monitoring [22], forest inventories [23],
wildlife management [24], and coastal environment monitoring [25]. Further, they can complement or
replace traditional satellite-based methods [26].
The pervasive use of UAVs has led to technical and societal concerns related to security, privacy,
and public safety which must be addressed. For example, a UAV interrupted a US Open tennis match
and another crashed at the White House. Shooting down the UAV is the most direct solution, but this
is typically illegal. In [27], a genetic algorithm (GA) was employed to jam UAV signals. In [28], GPS
signals were jammed to disable a UAV. These methods can be improved if the UAV is first detected
and located. Several algorithms have been developed which can be used for UAV detection. These
algorithms are based on video, sound, radar, temperature, radio frequency (RF), and WiFi [29–34].
Active radar was used to detect UAVs in [29]. However, this method can only provide accurate
detection results over short distances. A UAV was detected using a calibrated radar cross section (RCS)
technique in [30], but the RCS must be known a priori and this can be difficult to acquire. In [31], UAV
detection was achieved with a complex-log Fourier transform method and spectrogram features.
However, data is needed over a long time period which makes this approach complex and slow.
Log-polar transformations and space-variant resolution were used in [32] for UAV detection but the
accuracy is poor over long distances. A UAV was located in [33] using an acoustic array at an effective
range of 300 m. A multiple input multiple output (MIMO) radar was used to detect and track a UAV
in [34]. However, this radar is complex and has poor detection performance due to low received power
and receiver sensitivity.
An artificial neural network (ANN) was used in [35] to detect a UAV by analyzing the RF signal
characteristics, but this approach cannot provide azimuth information. A passive radar using digital
audio broadcasting (DAB) was proposed in [36] to detect a UAV. However, this method can only
determine if a UAV is present over short distances and cannot provide both azimuth and range
estimates. Range-velocity processing was used in [37] with a MIMO radar in the Ka band for UAV
detection, but this system is only effective over distances of several hundred meters. Multiple
heterogeneous sensors were used to track and detect a UAV in [38]. However, significant time is
required for track extraction and multi-target detection. A distributed passive radar was developed
in [39] for UAV detection.
UAV detection techniques based on video have been developed. However, it is challenging to
distinguish a UAV from other flying objects [40]. In [41], an audio classification system using data
mining techniques was proposed for UAV detection, but this system can only work well in quiet
environments. In [42], UAV detection was considered using a radar sensor which employs characteristic
features, but this system was only evaluated in a laboratory environment. A temperature-based method
was proposed to detect a UAV in [43] which is only effective for fixed-wing devices. A method was
proposed to detect UAV communication signals in [44]. A vision-based algorithm was proposed in [45]
to detect UAVs and estimate their distance. However, this approach is limited to short distances less
than 25 m. In [46], a tracking algorithm was developed to detect a moving UAV. Image processing was
considered in [47] to identify multiple UAVs, but this technique is complex.
A typical UAV has a very small RCS which makes radar-based detection algorithms unsuitable for
detection [48]. RF-based methods are more appropriate as they can achieve a higher processing gain.
In this paper, a spectrum sensing system is employed to detect a UAV. The contributions are as follows:
(1) Spectrum accumulation (SA) and statistical fingerprint analysis (SFA) techniques are used to
provide frequency estimates of RF signals. These estimates are used to determine if a UAV is
present in the detection environment.
(2) A region of interest (ROI) is defined to reduce the data size, improve the system efficiency and
provide accurate azimuth estimates.
(3) The performance of the proposed algorithm is compared with that using several well-known
techniques in the literature. Further, the ability to detect multiple UAVs with the proposed
algorithm is evaluated.
Sensors 2019, 19, 274 3 of 22
The remainder of this paper is organized as follows: the system model is given in Section 2.
Section 3 presents the proposed detection algorithm and the detection results are given in Section 4.
Finally, Section 5 concludes the paper.
2. System Model
A UAV communicates with the ground controller using RF signals so these signals can be used
for detection. Orthogonal frequency division multiplexing (OFDM) and frequency hopping (FH)
are employed [49], and the RF signals must meet the 119 ISM limitations [50]. The most common
frequency bands for these signals are at 2.4 GHz and 5 GHz. Figure 1 illustrates the experimental
setup for UAV detection. RF signals are acquired using the receiver and stored in a wireless
personal digital assistant (PDA). A UAV signal in the 2.5 GHz band is shown in the figure. The
main receiver parameters are given in Table 1. The antenna rotates over an angle in the range
0°–180° at a speed of 22.5°/8 s. The received RF signals are shown in Figure 1 as a matrix where the
x-axis denotes frequency f in the range 2.4 GHz–2.5 GHz and the y-axis denotes the RF signal power
y(f) in dBm. This is an M × N matrix A where M denotes the number of frequency values and N
denotes the number of azimuth angle values. The bandwidth of the UAV RF signals is
approximately 9.8 MHz [35] so the sampling frequency is f_w = 19.53 kHz.
Figure 1. The system model for UAV detection including the receiver, UAV, computer, and controller.
The receiver is used to collect the RF signals and the computer is used for signal processing.
The bidirectional arrow denotes communications between the UAV and controller.
Table 1. The main receiver parameters.

Parameter                Value
Gain                     24 dBi
Beamwidth                10°
Frequency range          2.3 GHz–2.7 GHz
Azimuth angle            0°–180°
Receiver dynamic range   72 dB
The device employed in the experiments is a Phantom 4 UAV (DJI, Shenzhen, China) which weighs
about 400 g and communicates with the controller using an RF signal at 2.4 GHz. All experiments
were conducted in a real environment outdoors at Ocean University of China where there were other
devices that share the 2.4 GHz band. Figure 2 shows the measurement locations for the experiments.
In the first experiment, the UAV hovered at a height of 100 m at distances of 500 m, 1500 m, and
2400 m from the receiver. This experiment was used for UAV detection using the proposed algorithm.
In the second experiment, the UAV was at distances of 2500 m and 2800 m from the receiver with
several interference signals having amplitudes larger than the UAV signal. This experiment was used
to test the noise and interference suppression performance of the proposed algorithm. Two UAVs were
considered in the third experiment at distances of 2400 m and 2500 m from the receiver as shown in
Figure 2b.
Figure 2. The experiment locations (a) UAV hovering at a height of 100 m at distances of 500 m,
1500 m, 2400 m, 2500 m, and 2800 m from the receiver, and (b) two UAVs at distances of 2400 m and
2500 m from the receiver.

3. Proposed Method
Figure 3 gives a flowchart of the proposed detection algorithm. In this algorithm, the static and
non-static clutter are suppressed using background filtering and linear trend suppression (LTS).
The noise is reduced using an automatic gain control (AGC) technique, and the signal to noise ratio
(SNR) is improved using an SVD algorithm. Spectrum accumulation (SA) and statistical fingerprint
analysis (SFA) methods are employed to provide two frequency estimates of the RF signals. These
estimates are used to determine if a UAV is present in the detection environment. A region of
interest (ROI) containing the azimuth information is defined using these estimates to reduce the data
size and improve the system efficiency. In this section, RF signals acquired at a distance of 500 m
between the UAV and receiver are used to illustrate the proposed method.
The static clutter is estimated as the mean of all values in A:

B = \frac{1}{M \times N} \sum_{m=1}^{M} \sum_{n=1}^{N} A(m, n)   (1)
and this is subtracted from each value in A giving the matrix C. The LTS algorithm is used to suppress
the non-static clutter. The result is [52]:
D = C - X (X^T X)^{-1} X^T C   (2)

where:

X = [x_1 \; x_2] = \begin{bmatrix} 0 & 1 \\ 1 & 1 \\ \vdots & \vdots \\ N-1 & 1 \end{bmatrix}   (3)
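The two clutter-suppression steps above can be sketched in a few lines of NumPy. This is a minimal illustration of (1)–(3), not the authors' implementation: the matrix sizes and random data are placeholders, and the projection is applied along the rows of C so that it conforms with the N-column design matrix X.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 512, 64                       # frequency bins x azimuth angles (illustrative)
A = rng.normal(size=(M, N))          # acquired power matrix A

# (1) background filtering: subtract the global mean of A
B = A.mean()
C = A - B

# (2)-(3) linear trend suppression: project out the component of each
# row of C that lies in the span of X = [x1 x2]
X = np.column_stack((np.arange(N), np.ones(N)))   # N x 2 design matrix
P = X @ np.linalg.inv(X.T @ X) @ X.T              # projection onto span(X)
D = C - C @ P                                     # detrended data
```

Because P is idempotent, projecting D a second time yields zero, which is a quick sanity check that the linear trend has been removed.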
where g_{min}(i, n) is the minimum gain for each value of n, and e(i, n) is the power of the signal in a
window of length w given by:

e(i, n) = \sqrt{\sum_{k=i}^{w+i-1} D(k, n)^2}   (7)
Using AGC, gmax can be predetermined using the calculated gain values. In this paper, gmax = 0.2.
From (4), the M × N azimuth–frequency matrix E is:

E = D \times g_{mask}   (8)
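The windowed signal power in (7) is a sliding root-sum-of-squares down the rows of D. A small sketch follows; the gain-mask construction in (4)–(6) is not reproduced in this excerpt, so only e(i, n) is shown, and the window length w is illustrative:

```python
import numpy as np

def window_power(D, w):
    """e(i, n) per (7): root of the sum of squared values of D
    over a length-w window of rows starting at row i."""
    M, N = D.shape
    e = np.empty((M - w + 1, N))
    for i in range(M - w + 1):
        e[i] = np.sqrt((D[i:i + w] ** 2).sum(axis=0))
    return e

D = np.ones((10, 3))
print(window_power(D, 4)[0])   # each entry is sqrt(4) = 2.0
```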
E = U S V^T = \sigma_1 u_1 v_1^T + \cdots + \sigma_k u_k v_k^T + \cdots + \sigma_N u_N v_N^T = M_1 + M_2 + \cdots + M_k + \cdots + M_N   (9)
where T denotes the matrix transpose, S is a diagonal matrix, U (M × M) and V (N × N) are unitary
matrices, and M_k is the kth intrinsic image with the same dimensions as E. The singular values σ_i in S
satisfy σ_1 ≥ σ_2 ≥ σ_3 ≥ · · · ≥ σ_r ≥ 0. Equation (9) can be expressed as:

E = M_{UAV} + M_{noise}   (10)

where M_{UAV} is the effective RF signal and M_{noise} is noise. To remove the noise, the singular values for
i > 20 are set to zero [54]. The resulting M × N azimuth–frequency matrix F is then [55]:
F = \sum_{i=1}^{20} \sigma_i u_i v_i^T   (11)
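The noise-reduction step in (9)–(11) is a truncated SVD: keep the 20 strongest singular components and discard the rest. A NumPy sketch (the rank-20 cutoff follows the text; the test matrix is random, not measured data):

```python
import numpy as np

def svd_denoise(E, k=20):
    """F per (11): zero the singular values for i > k and rebuild."""
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    s[k:] = 0.0                        # remove the noise subspace
    return (U * s) @ Vt                # sum of sigma_i * u_i * v_i^T for i <= k

rng = np.random.default_rng(1)
E = rng.normal(size=(100, 40))
F = svd_denoise(E, k=20)
print(np.linalg.matrix_rank(F))        # at most 20
```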
To further improve the SNR, Q points in the azimuth dimension are averaged giving [56]:
G(\lambda, n) = \frac{1}{Q} \sum_{m=\lambda Q}^{(\lambda+1)Q-1} F(m, n)   (12)
where λ = 1, · · · , ℵ and ℵ = ⌊M/Q⌋ is the largest integer not greater than M/Q [57]. G is then an
ℵ × N azimuth–frequency matrix. In this paper, Q = 12. Note that the result in (12) must satisfy the
Nyquist sampling theorem. Further, using (12) not only increases the SNR but also reduces the data
size, which improves the efficiency.
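The averaging in (12) reduces M rows to ℵ = ⌊M/Q⌋ rows by taking the mean of non-overlapping blocks of Q rows. A sketch using zero-based block indices (the paper's λ is one-based):

```python
import numpy as np

def block_average(F, Q=12):
    """G per (12): mean over non-overlapping blocks of Q rows."""
    aleph = F.shape[0] // Q                       # floor(M / Q)
    return F[:aleph * Q].reshape(aleph, Q, -1).mean(axis=1)

F = np.arange(24, dtype=float).reshape(12, 2)
G = block_average(F, Q=3)
print(G.shape)    # (4, 2): 12 rows averaged in blocks of 3
```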
is used where H() denotes the Hilbert transform. The frequency estimate is then:
\beta = \omega(\alpha)   (15)

where:

\alpha = \arg\max_{\lambda = 1, \cdots, \aleph} \{P(\lambda)\}   (16)

is the index of the peak in P, and ω(α) maps this index to a value in the frequency range
2.4 GHz–2.5 GHz. Figure 4
shows the frequency estimation results using the SA method for the acquired RF signals.
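Equations (15)–(16) reduce to taking the index of the largest accumulated-spectrum value and mapping it back to a frequency. In this sketch the mapping ω is assumed to be linear over the 2.4 GHz–2.5 GHz band, which is an interpretation rather than something stated in the excerpt:

```python
import numpy as np

def sa_frequency(P, f_lo=2.4e9, f_hi=2.5e9):
    """beta per (15)-(16): frequency at the peak index of P."""
    alpha = int(np.argmax(P))                      # (16)
    return np.linspace(f_lo, f_hi, len(P))[alpha]  # omega(alpha), assumed linear

P = np.zeros(101)
P[50] = 1.0                   # synthetic peak mid-band
print(sa_frequency(P))        # 2450000000.0, i.e. 2.45 GHz
```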
Figure 4. Frequency estimation results using the SA method for the RF signals acquired at a distance
of 500 m between the UAV and receiver.

3.5. Statistical Fingerprint Analysis

In this section, a frequency estimation method is given which is based on statistical features of
the UAV signals. These features include skewness [58], kurtosis [59], and standard deviation [60].
The goal is to detect the presence of a UAV in a given surveillance area. Figure 5 shows these
features for the received RF signals at a distance of 500 m. This indicates that the standard deviation
is better able to identify UAV signals. Thus, the frequency estimate is:

\tau = \omega(\upsilon)   (17)

where:

\upsilon = \arg\max_{j} \{\Phi(j)\}   (18)

\Phi[j] = |H(\Psi(j))|^2   (19)

and Ψ denotes the standard deviation of the values in each row of I. υ is the index of the peak in Φ.
Figure 6 shows the frequency estimation results using the SFA method with the acquired RF signals.
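The SFA chain in (17)–(19) can be sketched with SciPy: compute the row-wise fingerprints of Figure 5 (skewness, kurtosis, standard deviation), take the squared Hilbert-transform envelope of the standard-deviation profile, and map its peak index to a frequency. The linear index-to-frequency mapping ω and the synthetic test matrix are assumptions for illustration:

```python
import numpy as np
from scipy.signal import hilbert
from scipy.stats import skew, kurtosis

def sfa_features(I):
    """Row-wise statistical fingerprints as in Figure 5."""
    return skew(I, axis=1), kurtosis(I, axis=1), I.std(axis=1)

def sfa_frequency(I, f_lo=2.4e9, f_hi=2.5e9):
    """tau per (17)-(19): peak of |H(Psi)|^2, Psi the row-wise std."""
    _, _, psi = sfa_features(I)
    phi = np.abs(hilbert(psi)) ** 2          # (19)
    ups = int(np.argmax(phi))                # (18)
    return np.linspace(f_lo, f_hi, len(phi))[ups]

rng = np.random.default_rng(2)
I = rng.normal(scale=0.1, size=(101, 64))
I[50] *= 100.0                               # one row with a much larger spread
print(sfa_frequency(I) / 1e9)                # close to 2.45
```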
Figure 5. The features extracted from the RF signal acquired at a distance of 500 m between the UAV
and receiver.
Figure 6. Frequency estimation results using the SFA method with RF signals acquired at a distance
of 500 m between the UAV and receiver.

3.6. UAV Determination

As mentioned previously, the bandwidth of the UAV RF signal is 9.8 MHz. This can be used as a
threshold to determine if a UAV is present in the detection environment. The error between β
obtained using the SA method and τ obtained using the SFA method is given by:

\delta = |\tau - \beta|   (20)

A UAV is assumed to be present in the detection environment if δ ≤ 9.8 MHz.
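The decision rule in (20) is a single comparison; a trivial sketch with illustrative frequency values:

```python
def uav_present(tau, beta, bandwidth=9.8e6):
    """Declare a UAV per (20) when the SA and SFA estimates agree
    to within the 9.8 MHz signal bandwidth."""
    return abs(tau - beta) <= bandwidth

print(uav_present(2.451e9, 2.447e9))   # True: estimates differ by 4 MHz
print(uav_present(2.460e9, 2.420e9))   # False: 40 MHz apart
```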
3.7. Data Size Reduction

To disable an unauthorized UAV in a private or sensitive area, the UAV location must be
determined. Thus in this section, a ROI is obtained to improve the system efficiency, reduce the data
size, and increase the estimation accuracy. As the bandwidth of the RF signals is 9.8 MHz and they are
sampled at a frequency of 19.53 kHz, an area in the range [τ, τ + 500] is defined as the ROI. The result
of using the ROI is illustrated in Figure 7 using the data acquired at a distance of 500 m. The upper
figure shows the received RF signals before the ROI is used and the lower figure afterwards. These
results indicate that this approach significantly reduces the noise and interference.
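The ROI step is simply a slice of 500 frequency samples starting at the bin of the estimated frequency; everything outside is dropped before azimuth estimation. A sketch (treating τ as a bin index is an interpretation of the [τ, τ + 500] notation):

```python
import numpy as np

def roi_slice(G, tau_idx, width=500):
    """Keep only the ROI rows [tau_idx, tau_idx + width) of G."""
    return G[tau_idx:tau_idx + width]

G = np.zeros((5120, 64))          # illustrative azimuth-frequency matrix
R = roi_slice(G, 1000)
print(R.shape)                    # (500, 64)
```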
Figure 7. The received RF signals acquired at a distance of 500 m between the UAV and receiver.
The upper figure shows all received RF signals and the lower figure shows the signals in the ROI.
3.8. Azimuth Estimation

To estimate the azimuth between the UAV and receiver, only the signals in the ROI are
considered. The azimuth can be expressed as [56]:

\gamma = \eta(\kappa)   (21)

where:

\kappa = \arg\max_{\varepsilon} \{T(\varepsilon)\}   (22)
Figure 8. (a) The received RF signals, and the results after (b) background filtering, (c) AGC, (d) LTS,
(e) SVD, and (f) average filtering.
Figure Figure 8a shows
8a shows the results
the results afterafter spectrum
spectrum sensing
sensing which which indicate
indicate thatthat the signals
the RF RF signals are weak
are too too
weak to
Figure extract due to noise and large amplitude clutter. Background filtering effectively suppresses
to extract due8a to shows
noise and the large
resultsamplitude
after spectrum
clutter.sensing
Backgroundwhich filtering
indicateeffectively
that the RFsuppresses
signals arethe too
the clutter as shown in Figure 8b. The results using the AGC method are given in Figure 8c and show
weak as
clutter to shown
extract due to noise
in Figure 8b.andThelarge amplitude
results using theclutter.
AGC Background
method are given filtering in effectively
Figure 8c and suppresses
show
increased noise suppression. The results after using LTS in Figure 8d indicate an improvement in the
the clutter
increased as shown
noise in FigureThe
suppression. 8b. results
The results
afterusing
usingtheLTSAGC method
in Figure 8dare given in
indicate anFigure 8c and show
improvement in
RF signal. Figures 8e,f give the results after the SVD algorithm and the average filter, respectively.
increased
the RF noise
signal. Figure suppression.
8e,f8a,fgive The results
the results after using LTS in Figure 8d indicate an improvement in the
Comparing Figures confirms thatafter the SVD algorithm
the proposed and the average
method significantly reduces filter,
the respectively.
noise and
RF interference.
signal. Figures
Comparing Figure 8e,f
SNRgive
The8a,f the results
confirms
improvement after
that using
the theproposed
SVDmethod
proposed
the algorithm
method and the average
significantly
is evaluated using filter,
reduces thethe respectively.
noise RF
received and
Comparing
signals at different distances and is compared with the corresponding improvement using the ANNand
interference. Figures
The SNR 8a,f confirms
improvement that
usingthe proposed
the proposed method
method significantly
is evaluated reduces
using the noise
received
RFinterference.
signals The
at different
and acoustic SNR
array improvement
distances
methods. and
The is using
compared
SNR thewith
is defined proposed method is evaluated
as: the corresponding improvement usingusingthe received
the ANN RF
signals
and at different
acoustic distances
array methods. Theand is compared
SNR is defined with
as: the corresponding improvement using the ANN
N τ + 500 N M
and acoustic array methods. The SNR is defined as

SNR = 20 log10 [ ∑_{i=τ}^{τ+500} ∑_{j=1}^{N} |G(i, j)| / ∑_{i=1}^{M} ∑_{j=1}^{N} |G(i, j)| ]. (24)

The SNR using the proposed method is −11.39 dB for 500 m, −14.57 dB for 1500 m, −22.46 dB for 2400 m, −23.48 dB for 2500 m, and −27.93 dB for 2800 m. These results show that the power of the received signal decreases with distance, as expected. For 500 m, the SNR is −22.89 dB using the ANN method and −23.56 dB using the acoustic array method. Thus, these methods have much lower SNRs at 500 m, which indicates poor UAV detection performance. The results for the proposed algorithm are shown in Figure 9a for 1500 m and Figure 9b for 2400 m. These show that the RF signals are improved significantly and the noise is suppressed.
Figure 9. The detection results using the proposed algorithm based on the RF signals acquired at distances of (a) 1500 m, and (b) 2400 m.
4.2. Detection Performance in Strong Interference
Figure 10. The RF signals acquired in strong interference at distances of (a) 2500 m, and (b) 2800 m.
Figure 11. The results for the proposed algorithm using RF signals acquired at a distance of (a) 2500 m, and (b) 2800 m.
Figure 10 shows that there are several RF interferers in the detection environments. The circular regions denote interference signals with larger amplitudes than the UAV signal. Thus, this is a challenging detection problem. The results using the proposed method are given in Figure 11. This shows that the interference is effectively suppressed, which will improve the detection performance. The SNR for the proposed, ANN, CFAR, and HOC methods is given in Table 2. This shows that the proposed method provides the best SNR improvement while the ANN method has the worst improvement.

Table 2. SNR (dB) using four methods with the data obtained at distances of 2500 m and 2800 m. ANN: artificial neural network, CFAR: constant false alarm rate, HOC: higher-order cumulant.

Method      2500 m    2800 m
Proposed    −25.72    −29.71
ANN         −38.35    −43.62
HOC         −31.46    −35.72
CFAR        −29.57    −33.78
Sensors 2019, 19, 274 12 of 22
Figure 12. Frequency estimation results using the (a) SA, and (b) SFA methods.
Table 3. Frequency estimation accuracy with three methods. CFAR: constant false alarm rate, HOC: higher-order cumulant.

              Error (MHz)
Method        500 m    1500 m    2400 m    2500 m    2800 m
Proposed (τ)  0.5      0.5       0.1       0.5       0.5
Proposed (β)  0.5      0.5       0.1       0.5       0.5
HOC           2.3      5.7       18        15        37
CFAR          4.5      19        27        69        91
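To illustrate how the two frequency estimates compared in Table 3 could be formed, the following sketch derives τ from spectrum accumulation (SA, peak of the accumulated spectrum P) and β from the statistical fingerprint (SFA, peak of the per-bin standard deviation Ψ), then checks their agreement; the spectrogram layout, the function name, and the tolerance delta_max are assumptions, not the paper's implementation.

```python
import numpy as np

def estimate_frequencies(I, freqs, delta_max=1e6):
    # I: (time x frequency) magnitude spectrogram of the received RF
    # signal; freqs: frequency of each bin (e.g. 2.4-2.5 GHz).
    P = I.sum(axis=0)             # SA: accumulated spectrum
    Psi = I.std(axis=0)           # SFA: per-bin standard deviation
    tau = freqs[np.argmax(P)]     # SA frequency estimate
    beta = freqs[np.argmax(Psi)]  # SFA frequency estimate
    # Declare a UAV present when the two estimates agree closely.
    detected = abs(tau - beta) <= delta_max
    return tau, beta, detected
```

A fluctuating UAV emission produces a peak in both P and Ψ at the same bin, so the two independent estimates reinforce each other and reduce false alarms from steady interferers.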
Figure 13. RF signals in the ROI at a distance of (a) 500 m, (b) 1500 m, (c) 2500 m, (d) 2800 m, and (e) 2400 m.
The azimuth estimation in the ROI using SFA is shown in Figure 16 for five distances. The azimuth estimates with the proposed algorithm are 101° for 500 m, 101° for 1500 m, 110° for 2500 m, 110° for 2800 m, and 67° for 2400 m. Table 4 gives the azimuth estimation accuracy with three methods. These results indicate the proposed method has the smallest error. The detection rates using three algorithms are given in Table 5 and show that better UAV detection performance is obtained with the proposed algorithm.
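A minimal sketch of the SFA-based azimuth search described above, using the Appendix A variables K (standard deviation of the signals in the ROI in the azimuth direction) and κ (index of the peak in K); the data layout and function name are assumptions.

```python
import numpy as np

def estimate_azimuth(R, azimuths):
    # R: (azimuth x sample) matrix of RF signals restricted to the ROI;
    # azimuths: the azimuth angle (degrees) of each row of R.
    K = R.std(axis=1)          # fluctuation of each azimuth cell
    kappa = int(np.argmax(K))  # peak index in K
    return azimuths[kappa]
```

Restricting the search to the ROI keeps the matrix R small, which is what gives the efficiency gain noted earlier.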
Figure 14. Frequency estimation in the ROI using SA at distances of (a) 500 m, (b) 1500 m, (c) 2500 m, (d) 2800 m, and (e) 2400 m.
Figure 15. Frequency estimation in the ROI using SFA at distances of (a) 500 m, (b) 1500 m, (c) 2500 m, (d) 2800 m, and (e) 2400 m.
Figure 16. Azimuth estimation in the ROI using SFA at distances of (a) 500 m, (b) 1500 m, (c) 2500 m, (d) 2800 m, and (e) 2400 m.
Table 4. Azimuth estimation accuracy using three methods. CFAR: constant false alarm rate, HOC:
higher-order cumulant.
Error (°)
Method
500 m 1500 m 2400 m 2500 m 2800 m
Proposed 3.86 5.18 3.35 4.27 7.24
HOC 11.25 19.37 34.96 8.39 12.49
CFAR 7.68 9.24 13.86 9.27 11.24
Table 5. Detection rate (%) using three methods. CFAR: constant false alarm rate, HOC:
higher-order cumulant.
Table 6. Parameter estimates for two UAVs using four different methods. ANN: artificial neural
network, CFAR: constant false alarm rate, HOC: higher-order cumulant.
                                 2400 m             2500 m
Method     Parameter         Estimate   Error   Estimate   Error
Proposed   Frequency (GHz) τ  2.463     0.005    2.427     0.005
           Frequency (GHz) β  2.463     0.005    2.427     0.005
           Azimuth            65°       5°       112°      12°
ANN        Frequency (GHz)    2.446     0.022    2.453     0.031
           Azimuth            30°       30°      76°       34°
HOC        Frequency (GHz)    2.439     0.029    2.412     0.120
           Azimuth            22°       38°      56°       54°
CFAR       Frequency (GHz)    2.436     0.102    2.484     0.062
           Azimuth            37°       23°      70°       40°
Figure 17. (a) The received RF signals and (b) the results using the proposed method, with two UAVs at distances of 2400 m and 2500 m.
Figure 18. Frequency estimates using the (a) SA, and (b) SFA methods, with two UAVs at distances of 2400 m and 2500 m.
Figure 19. Azimuth estimates using the SFA method with UAVs at distances of 2400 m and 2500 m.
5. Conclusions

UAV detection is very important due to the threats they can pose to personal and public privacy and safety. In this paper, a method was proposed which employs background filtering to suppress static clutter, and non-static clutter was reduced using linear trend suppression (LTS). Further, automatic gain control (AGC) was used to remove noise and improve the SNR, and the SNR was further increased using singular value decomposition (SVD). The spectrum accumulation (SA) and statistical fingerprint analysis (SFA) methods were employed to obtain frequency estimates. A region of interest (ROI) was defined using these estimates to improve the system efficiency and provide accurate azimuth estimates. To validate the proposed algorithm, experiments were conducted in a real outdoor environment, and the results obtained indicate that it provides excellent UAV detection performance. The recognition rate with this method was near 100% within a distance of 2400 m and greater than 90% within a distance of 3000 m, which is better than with other well-known detection algorithms in the literature. Further, it was shown that the azimuth and frequency of multiple UAVs can be accurately estimated using the proposed algorithm.
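As one hedged example of the SVD denoising step summarized above (a sketch under assumed variable names from Appendix A, not the authors' implementation), the measured data matrix can be reconstructed from its n dominant singular components, with the discarded components treated as noise:

```python
import numpy as np

def svd_denoise(M, n):
    # Decompose M and keep the n largest singular values sigma_i,
    # i.e. the intrinsic images M_k assumed to carry the effective
    # UAV signal (M_UAV); the remainder approximates M_noise.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :n] * s[:n]) @ Vt[:n, :]
```

The choice of n trades residual noise against signal distortion; a low-rank reconstruction suppresses broadband noise because the UAV return concentrates in a few singular components.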
Author Contributions: Conceptualization, X.L.; methodology, X.L.; validation, X.L.; formal analysis, X.L.; investigation, X.L.; resources, X.L.; writing-original draft preparation, X.L.; writing-review and editing, T.A.G.; supervision, H.Q.; funding acquisition, S.Y. and H.Q.

Funding: This work was funded by the National High Technology Research and Development Program of China (2012AA061403), the National Science & Technology Pillar Program during the Twelfth Five-year Plan Period (2014BAK12B00), the National Natural Science Foundation of China (61501424, 61701462 and 41527901), the Ao Shan Science and Technology Innovation Project of Qingdao National Laboratory for Marine Science and Technology (2017ASKJ01), the Qingdao Science and Technology Plan (17-1-1-7-jch), and the Fundamental Research Funds for the Central Universities (201713018).

Acknowledgments: The authors would like to thank the anonymous reviewers for their valuable comments and suggestions to improve the quality of the article.

Conflicts of Interest: The authors declare no conflict of interest.

Appendix A

The main variables and their description are given in Table A1.
Variable Description
gmax maximum gain
gnorm [i, n] normalized gain
gmin [i, n] minimum gain
e[i, n] RF signal power in a window of length w
S diagonal matrix
U M × M unitary matrix
V N × N unitary matrix
σi singular values
Mk kth intrinsic image
MUAV effective RF signals
Mnoise noise
n the number of selected singular values
α index of the peak in P
ω frequency value in the range 2.4 GHz–2.5 GHz
Ψ standard deviation of I in the frequency domain
υ index of the peak in Ψ
β frequency estimate using the SFA method
τ frequency estimate using the SA method
δ error in the two frequency estimates
K standard deviation of the signals in the ROI in the azimuth direction
κ index of the peak in K
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).