
J Opt

DOI 10.1007/s12596-017-0395-0

RESEARCH ARTICLE

Target geo-localization based on camera vision simulation of UAV


Prashanth Pai¹ · V. P. S. Naidu²

Received: 20 July 2016 / Accepted: 28 March 2017


© The Optical Society of India 2017

Abstract  This paper presents a simulation study on estimating the geo-location of a target from multiple images of the target taken by a gimbaled camera mounted on an unmanned aerial vehicle (UAV), which orbits the target with a radius such that the target always remains within the field of camera vision. The camera vision simulation of the UAV is implemented by using an ortho Geo-TIFF (Geo-Spatial Tagged Information File Format) image as the imagery reference, the positional and attitude attributes of the UAV, gimbal and camera, and the internal characteristics of the simulated camera. The target is localized from the simulation images taken at multiple bearing waypoints by applying the Geo-Location algorithm with the simulation parameters as reference. For improving the accuracy of the estimation, error reduction techniques such as true average, moving average and recursive least squares are also suggested and implemented.

Keywords  Target Geo-Location · Multiple bearing · Simulation · Geo-TIFF · Unmanned aerial vehicle (UAV) · Error reduction technique

Corresponding author: V. P. S. Naidu, [email protected]
Prashanth Pai, [email protected]

¹ Department of Engineering, University of Leicester, Leicester, UK
² Multi Sensor Data Fusion Lab, CSIR-National Aerospace Laboratories, Bangalore, India

Introduction

Unmanned aerial vehicles (UAV) are aircraft without a human pilot aboard. UAVs are preferably used for carrying out missions considered to be "Dull, Dirty and Dangerous", as suggested by Tice [1]. UAV applications mainly involve sensors placed on the aerial vehicle that collect data of particular interest about the surrounding environment and send it to a Ground Control Station for the necessary operations. Target Geo-Location is one such application, where the visual data captured by the gimbaled camera mounted on the UAV is sent to the Ground Control Station for determining the geo-location coordinates of a target point of interest present in the camera vision.

This paper presents a simulation approach for validating and verifying the Target Geo-Location algorithm suggested by Pratyusha [2]. The work in [2] mainly involved determining the target location from single-bearing image data. This paper suggests determining the geo-location of a target using multiple-bearing image data, and also implements error reduction techniques to improve the accuracy of the result.

The focus of this paper is to create a vision simulation of a gimbaled camera mounted on a UAV, based on the application of the Geo-Location algorithm. The simulated UAV model is made to take a circular path around the target with a particular orbital radius such that the target is always within the field of vision. The paper also focuses on determining the geo-location of the target point of interest from the images generated by the simulation, and on applying error reduction techniques to improve the accuracy of the estimate.


Estimation of circular waypoint

The UAV is considered in this simulation to take a circular path around the target because of its ease of implementation and its efficient capability of tracking a target [3]. The vision simulation of a camera mounted on a UAV taking a circular path of known radius around a given target is realized by generating the images only from equally spaced circular waypoints on that path (a total of 36 equally spaced circular waypoints is considered in this study). Hence, estimation of the circular waypoints is essential to determine the UAV locations from which the multiple bearing images will be taken for the simulation. The circular waypoint locations are determined by applying Haversine's formula [4], as mentioned in Eq. (1), with the knowledge of the geo-location coordinates of the target, the distance of separation (i.e. the orbital radius) and the bearing angle of separation.

Uav\_ll(i,1) = \sin^{-1}\!\big(\sin(target\_lat)\cos(del) + \cos(target\_lat)\sin(del)\cos(i\tau)\big)
Uav\_ll(i,2) = target\_lon + \operatorname{atan2}\!\big(\sin(i\tau)\sin(del)\cos(target\_lat),\ \cos(del) - \sin(target\_lat)\sin(Uav\_ll(i,1))\big)     (1)

where i is the sample number (iteration count or bearing angle index), Uav_ll(i,1) is the UAV latitude in radians for the ith sample, Uav_ll(i,2) is the UAV longitude in radians for the ith sample, target_lat is the latitude coordinate of the target in radians, target_lon is the longitude coordinate of the target in radians, del = d/6371000, d is the radius of the UAV orbit, 6,371,000 m is the radius of the Earth, τ = 360/n is the fundamental bearing angle considered and n is the total number of circular waypoints.
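A minimal Matlab sketch of Eq. (1) is given below (not from the paper). It assumes the target coordinates and the 50 m orbit radius quoted in the paper, and converts all angles to radians before applying the formula.

% Sketch of Eq. (1): equally spaced circular waypoints around the target.
target_lat = deg2rad(12.963021580681730);   % target latitude (rad)
target_lon = deg2rad(77.651407302614828);   % target longitude (rad)
d   = 50;                                   % UAV orbit radius (m)
del = d/6371000;                            % angular distance, d / Earth radius
n   = 36;                                   % number of circular waypoints
tau = deg2rad(360/n);                       % fundamental bearing angle (rad)

Uav_ll = zeros(n,2);
for i = 1:n
    Uav_ll(i,1) = asin(sin(target_lat)*cos(del) + cos(target_lat)*sin(del)*cos(i*tau));
    Uav_ll(i,2) = target_lon + atan2(sin(i*tau)*sin(del)*cos(target_lat), ...
                                     cos(del) - sin(target_lat)*sin(Uav_ll(i,1)));
end
Uav_ll = rad2deg(Uav_ll);                   % waypoint latitude/longitude in degrees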
Geo-location algorithm

The simulation database creation (synthetic image generation) and the target localization using the generated simulation images are both implemented in this study based on the Geo-Location algorithm [2].

The Geo-Location algorithm relates the 2D pixel coordinates of an object present in the image plane to its 3D world coordinates, also called inertial coordinates. In this work the Geo-Location algorithm is applied to determine the real world coordinates from the known pixel coordinates of the point of interest. For applying the Geo-Location algorithm with the pixel coordinates of the point of interest, it is also essential to know the different coordinate frames, the transformation matrices between coordinate frames, and the camera intrinsic parameters, namely the Camera Calibration Matrix and the Image Depth.

Co-ordinate frames

The coordinate frames involved in the transformation of 2D pixel (camera) coordinates to 3D real world (inertial) coordinates [5] are (Fig. 1):

• Inertial Frame (Xi, Yi, Zi)  The inertial frame, or 3D real world frame, describes an object location in real world coordinates, with Xi describing its North position, Yi its East position and Zi the distance from the centre of the Earth.
• Vehicle Frame (Xv, Yv, Zv)  The vehicle frame is co-linear with the inertial frame, with its origin located at the centre of mass of the plane.
• Body Frame (Xb, Yb, Zb)  The origin of the body frame is located at the centre of mass of the plane, with Xb, Yb and Zb describing the vehicle nose, right wing and belly of the plane.
• Gimbal Frame (Xg, Yg, Zg)  This frame originates at the gimbal centre of rotation and is oriented so that Xg describes the direction of the optical axis, Zg the width direction in the image plane and Yg the height direction in the image plane.
• Camera Frame (Xc, Yc, Zc)  The camera frame has its origin at the optical centre of the camera; Xc, Yc and Zc describe the width direction in the image plane, the height direction in the image plane and the direction of the optical axis of the camera lens.

Fig. 1  Coordinate frames (axes of the inertial, vehicle, body, gimbal and camera frames)


Transformation matrices

A transformation matrix is needed to relate the position of the target point of interest in the camera frame (image plane) to its position in the inertial frame (real world 3D location). A transformation matrix is defined between adjacent coordinate frames and consists of a translatory vector and a rotational matrix. The transformation matrices used for determining the target geo-location from its image coordinates are given below [5].

Transformation from inertial to vehicle frame

As mentioned in the "Co-ordinate frames" section, the inertial frame and the vehicle frame are co-linear with each other. Hence, the transformation matrix T_i^v can be defined by a translation between the origins of the two frames, as mentioned in Eq. (2).

T_i^v = \begin{bmatrix} I & d_i^v \\ 0 & 1 \end{bmatrix}     (2)

where

d_i^v = \begin{bmatrix} x_{UAV} \\ y_{UAV} \\ h_{UAV} \end{bmatrix}

x_{UAV} and y_{UAV} represent the North and East location of the UAV calculated from GPS, and h_{UAV} is the altitude of the UAV.

Transformation from vehicle frame to body frame

The body frame provides the attitude of the UAV with respect to the vehicle frame. Hence the transformation matrix can be defined using the rotational angles of the UAV, roll (φ), pitch (θ) and yaw (ψ), as mentioned in Eq. (3).

T_v^b = \begin{bmatrix} R_v^b & 0 \\ 0 & 1 \end{bmatrix}     (3)

where

R_v^b = \begin{bmatrix} c_\theta c_\psi & c_\theta s_\psi & -s_\theta \\ s_\phi s_\theta c_\psi - c_\phi s_\psi & s_\phi s_\theta s_\psi + c_\phi c_\psi & s_\phi c_\theta \\ c_\phi s_\theta c_\psi + s_\phi s_\psi & c_\phi s_\theta s_\psi - s_\phi c_\psi & c_\phi c_\theta \end{bmatrix}

with the shorthand c_\phi = \cos\phi and s_\phi = \sin\phi (and similarly for θ and ψ).

Transformation from body frame to gimbal frame

The transformation from the body frame to the gimbal frame is given by the translation vector d_g^b, which defines the distance of separation between the two frames, and the rotational matrix R_b^g, which defines the rotational orientation of the gimbal with respect to the UAV body. The rotational orientation can be determined using the angles that define the 2D orientation of a gimbal, namely pan (α_az) and tilt (α_el).

T_b^g = \begin{bmatrix} R_b^g & d_g^b \\ 0 & 1 \end{bmatrix}     (4)

where

R_b^g = R_{y,\alpha_{el}} R_{z,\alpha_{az}} = \begin{bmatrix} c_{el} c_{az} & c_{el} s_{az} & -s_{el} \\ -s_{az} & c_{az} & 0 \\ s_{el} c_{az} & s_{el} s_{az} & c_{el} \end{bmatrix}

d_g^b = [X_g^b, Y_g^b, Z_g^b]^T = [0, 0, 0]^T is the translatory vector (in this simulation the centre of mass of the body and of the gimbal are considered to be the same), α_az denotes the azimuth angle of rotation about Zg, α_el denotes the elevation angle of rotation about Yg after α_az, and c_{el} = cos α_el, s_{az} = sin α_az.

Transformation from gimbal frame to camera frame

The transformation from the gimbal frame to the camera frame is used for aligning the image frame with respect to the gimbal frame. In this experiment the camera orientation is kept such that the camera frame is collinear with the gimbal frame, and the gimbal centre of mass and the optical centre are assumed to be at the same location. Hence, the transformation matrix T_g^c is given by Eq. (5).

T_g^c = \begin{bmatrix} R_g^c & d_c^g \\ 0 & 1 \end{bmatrix}     (5)

where

R_g^c = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}

and d_c^g = [X_c^g, Y_c^g, Z_c^g]^T = [0, 0, 0]^T is the translatory vector between the camera and gimbal frames.

Camera Calibration Matrix

The perspective transformation provides the relationship between the position of the target point of interest in the camera frame (pixel coordinates) and its location in the inertial frame (3D world coordinates). However, it accounts only for the external parameters (positional orientation) and not for the internal characteristics of the camera, which also determine where a 3D real world point is mapped in the camera image frame. The internal characteristics of a camera are defined by the Camera Calibration Matrix (C_c) [6], written in homogeneous form as mentioned in Eq. (6).

C_c = \begin{bmatrix} f_x & f_\theta & c_x & 0 \\ 0 & f_y & c_y & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}     (6)

where C_c is the Camera Calibration Matrix, f_x and f_y are the focal lengths measured along the width and height pixel directions of the image plane, f_\theta is the skew of the pixel, and c_x and c_y give the principal point of the camera in pixels.
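The Matlab snippets in the later sections assemble these transforms with small helper functions (htransl, hrotx, hroty, hrotz, hcam) whose definitions are not listed in the paper. The sketch below gives one plausible set of definitions, consistent with the 4 × 4 homogeneous form of Eqs. (2)-(6) (angles in radians, fx = f/sx, fy = f/sy, zero skew); the original helpers may differ, in particular in sign convention.

% Hypothetical helper definitions (assumptions, not from the paper).
function T = htransl(x, y, z)
% Homogeneous translation in the form of Eq. (2): [I d; 0 1]
T = [eye(3) [x; y; z]; 0 0 0 1];
end

function T = hrotx(a)
% Homogeneous rotation about the x axis (roll)
T = [1 0 0 0; 0 cos(a) sin(a) 0; 0 -sin(a) cos(a) 0; 0 0 0 1];
end

function T = hroty(a)
% Homogeneous rotation about the y axis (pitch / gimbal elevation)
T = [cos(a) 0 -sin(a) 0; 0 1 0 0; sin(a) 0 cos(a) 0; 0 0 0 1];
end

function T = hrotz(a)
% Homogeneous rotation about the z axis (yaw / gimbal azimuth)
T = [cos(a) sin(a) 0 0; -sin(a) cos(a) 0 0; 0 0 1 0; 0 0 0 1];
end

function C = hcam(f, sx, sy, cx, cy)
% Camera calibration matrix in the homogeneous form of Eq. (6)
C = [f/sx 0 cx 0; 0 f/sy cy 0; 0 0 1 0; 0 0 0 1];
end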


Image Depth

In order to compute the 3D world coordinates from the pixel coordinates, it is essential to estimate the transformation matrices, which account for the external parametric variation, and the camera matrix, which defines the intrinsic characteristics of the camera. Along with these two, it is also required to determine an unknown parameter called the Image Depth λ, which carries the information on the real world distance between the camera centre and the target point of interest along the optical axis (Fig. 2). The Image Depth λ can be computed [6] using Eq. (7).

\lambda = \dfrac{Z_{cc}^i}{Z_{cc}^i - \bar{Z}_{obj}^i}     (7)

where Z_{cc}^i is the z component of P_{cc}^i, with P_{cc}^i = [T_g^c T_b^g T_v^b T_i^v]^{-1} P_{cc}^c the location of the centre of the camera in the inertial frame and P_{cc}^c = [0\ 0\ 0\ 1]^T the location of the centre of the camera in the camera frame; \bar{Z}_{obj}^i is the z component of \bar{P}_{obj}^i, with \bar{P}_{obj}^i = [C_c T_g^c T_b^g T_v^b T_i^v]^{-1} q the un-scaled location in the inertial frame of the object at pixel coordinate q.

Fig. 2  Visualization of the Image Depth (camera centre P_cc^i, un-scaled point P̄_obj^i and target point P_obj^i along the camera ray)

Determining geo-location from pixel coordinate

The geo-location coordinates of the target point of interest located at pixel coordinate q = (x_{ip}\ y_{ip}\ 1\ 1)^T in the image plane can be determined from the transformation matrices, the camera matrix and the Image Depth calculated [7] in the "Transformation matrices", "Camera Calibration Matrix" and "Image Depth" sections, as mentioned in Eq. (8).

P_{obj}^i = \big[C_c\, T_g^c\, T_b^g\, T_v^b\, T_i^v\big]^{-1} \Lambda q     (8)

where P_{obj}^i is the geo-location coordinate of the target, \Lambda = \begin{bmatrix} \lambda I & 0 \\ 0 & 1 \end{bmatrix} and λ is the Image Depth.

The target geo-location can also be determined from the parameters used to calculate the Image Depth λ, by applying the similar-triangles relation [7] illustrated in Fig. 2, as mentioned in Eq. (9).

P_{obj}^i = [Utmx,\ Utmy,\ Utmz,\ 1]^T = P_{cc}^i + \lambda\,\big(\bar{P}_{obj}^i - P_{cc}^i\big)     (9)

where P_{cc}^i is the location of the centre of the camera measured in the inertial frame and \bar{P}_{obj}^i is the un-scaled geo-location coordinate of the target.
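To make the role of the image depth concrete, the short sketch below (not from the paper; illustrative numbers only, 3D components shown without the homogeneous element) applies Eqs. (7) and (9): λ scales the un-scaled ray point so that the ray from the camera centre meets the ground plane.

% Sketch of Eqs. (7) and (9) with illustrative values.
P_cc  = [10; 20; 60];        % camera centre in the inertial frame (z = altitude)
P_bar = [10.5; 20.2; 59];    % un-scaled object point on the same ray (from Eq. (8) before scaling)
lambda = P_cc(3) / (P_cc(3) - P_bar(3));     % Eq. (7): image depth
P_obj  = P_cc + lambda*(P_bar - P_cc);       % Eq. (9): scaled target location
% P_obj(3) is zero by construction: the estimated target lies on the ground plane.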
Synthetic scene generation

Synthetic scene generation, or simulation image generation, is the process of creating images from a simulated camera model placed on a UAV, using an aerial Geo-TIFF (Geospatial Tagged Image File Format) as the imagery reference. Synthetic scene generation is achieved in this study by applying the Geo-Location algorithm: the algorithm is used to determine the geo-location coordinates of each and every pixel of the simulated camera model from the simulation model parameters defined for the UAV, gimbal and camera, which include the various positional and attitude attributes along with the intrinsic parameters (for the camera only). The UAV heading angle is chosen to be the negative of the bearing angle to provide a more realistic simulation (Fig. 3). The pixels of the resulting raster matrix of geo-location coordinate values are then filled with RGB imagery data according to the following conditions:

• If the Image Depth λ estimated for a given pixel of the raster matrix is zero or negative, the pixel is shaded sky blue, indicating the sky region.
• If the geo-location coordinate value matches a pixel in the aerial Geo-TIFF image tagged with the same geo-location coordinate, the imagery data for the given pixel is filled with that of the matching Geo-TIFF pixel.
• If the geo-location coordinate value in the raster matrix has no matching pixel in the Geo-TIFF file tagged with the same location coordinate, the pixel is shaded black, indicating that the imagery data for the given geo-location is unknown.

The Matlab code snippet for generating the synthetic image for a single bearing is given below. uav, gimbal and cam are structures defining the parameters of the UAV, gimbal and camera; A, info, lattiff and lontiff provide the information related to the Geo-TIFF file, with the lattiff and lontiff arrays giving the latitude and longitude that define the coordinates of each pixel in the Geo-TIFF.


function out = scenegenerate(uav,gimbal,cam,A,info,lattiff,lontiff)
% lat,lon to utm conversion
[utmx, utmy, zone] = ll2utm(uav.lat, uav.lon);

% Inertial frame to Vehicle frame translation
vXi = utmx; vYi = utmy; vZi = uav.alt;
vTi = htransl(-vXi, -vYi, -vZi);

% Vehicle to Body frame transformation
bTv = hrotx(deg2rad(-uav.phi)) * hroty(deg2rad(-uav.theta)) * hrotz(deg2rad(-uav.psi));

% Body to Gimbal frame transformation
alpha = 180-gimbal.alpha; beta = gimbal.beta;
gTb = htransl(-gimbal.bXg, -gimbal.bYg, -gimbal.bZg) * hrotx(deg2rad(-beta)) * hroty(deg2rad(-alpha));

% Gimbal frame to Camera frame translation and rotation
cTg = htransl(-cam.gXc, -cam.gYc, -cam.gZc) * eye(4);

% Camera Calibration Matrix
C = hcam(cam.f, cam.sx, cam.sy, cam.cx, cam.cy);

% Homogeneous pixel coordinates of every pixel in the image raster
x_ip = repmat(1:cam.w,1,cam.h);
y_ip = repmat(1:cam.h,cam.w,1); y_ip = y_ip(:)';
q = [x_ip; y_ip; ones(size(x_ip)); ones(size(x_ip))];

% Un-scaled object locations and camera centre in the inertial frame
i_pbar_obj = (C * cTg * gTb * bTv * vTi) \ q;
c_p_cc = [0; 0; 0; 1];
i_p_cc = (cTg * gTb * bTv * vTi) \ c_p_cc;

% Image Depth
lambda = i_p_cc(3)./(i_p_cc(3) - i_pbar_obj(3,:));
Q = [lambda.*x_ip; lambda.*y_ip; lambda; ones(size(lambda))];

% Geo-Location Estimation for each pixel in the Image
i_p_obj = inv(C * cTg * gTb * bTv * vTi) * Q;
[cam.lats, cam.lons] = utm2ll(i_p_obj(1,:), i_p_obj(2,:), zone*ones(size(x_ip)));
xtif  = round((cam.lons - info.BoundingBox(1,1))/info.PixelScale(1,1));
ytif  = size(A,1) - round((cam.lats - info.BoundingBox(1,2))/info.PixelScale(2,1));
ytif1 = round((cam.lats - info.BoundingBox(1,2))/info.PixelScale(2,1));

% Generate Image by Mapping the Obtained Pixel Geo-Location
% Co-Ordinate with Geo-Tiff Information
for n = 1:length(x_ip)
    % If Image Depth is zero or negative, shade pixel with blue (sky)
    if lambda(n) <= 0
        cam.img(cam.h+1-y_ip(n), cam.w+1-x_ip(n), :) = [79, 110, 176];
    % Pixel with unavailable imagery data shaded with black
    elseif (xtif(n)<=0 || ytif(n)<=0 || xtif(n)>info.Width || ytif(n)>info.Height)
        cam.img(cam.h+1-y_ip(n), cam.w+1-x_ip(n), :) = [0, 0, 0];
    % Imagery data taken from matching pixel of Geo-TIFF
    else
        cam.img(cam.h+1-y_ip(n), cam.w+1-x_ip(n), :) = A(ytif(n), xtif(n), :);
        cam.imglats(cam.h+1-y_ip(n)) = lattiff(ytif(n));
        cam.imglons(cam.w+1-x_ip(n)) = lontiff(xtif(n));
    end
end
out = cam;

Fig. 3  UAV heading angle variation for change in bearing angle (heading Ψ = −τ, where τ is the bearing angle towards the target location)

Estimation of target geo-location

Target geo-localization is the process of estimating the real world location of a ground-based target present in an image taken from a camera placed on a UAV, by applying the Geo-Location algorithm. In this study the geo-location of the target is estimated for the multiple bearing images generated in simulation, by applying the Geo-Location algorithm as mentioned in Eq. (8). The same UAV, gimbal and camera parameters that were used to simulate a given bearing image are used while localizing the target in that image. The Matlab code snippet for determining the geo-location of the target present in a bearing image is given below. x_ip and y_ip describe the pixel coordinates of the target in the image, while cam, gimbal and uav describe the simulation parameters of the camera, gimbal and UAV. The code for computing the transformation and camera matrices is not repeated in the snippet, as the steps are identical to those in the scene generation code. i_p_cc, lambda and i_pbar_obj are also saved in the output for further computations.

function out = geolocation(x_ip,y_ip,cam,gimbal,uav)
% Code space for Transformation Matrix and Camera Matrix computation:
% identical to the steps in scenegenerate (vTi, bTv, gTb, cTg, C, zone)

% Homogeneous pixel coordinate of the target point of interest
q = [x_ip; y_ip; 1; 1];

% Unscaled Object Target Location in Inertial Frame
out.i_pbar_obj = (C * cTg * gTb * bTv * vTi) \ q;
c_p_cc = [0; 0; 0; 1];

% Camera Center Location w.r.t Inertial Co-Ordinates
out.i_p_cc = (cTg * gTb * bTv * vTi) \ c_p_cc;

% Image Depth
out.lambda = out.i_p_cc(3)./(out.i_p_cc(3) - out.i_pbar_obj(3,:));

% Target Geo-Location Estimation
if out.lambda > 0
    Q = [out.lambda.*x_ip; out.lambda.*y_ip; out.lambda; ones(size(out.lambda))];
    i_p_obj = inv(C * cTg * gTb * bTv * vTi) * Q;
    [out.lat, out.lon] = utm2ll(i_p_obj(1,:), i_p_obj(2,:), zone);
end
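A hedged usage sketch (not from the paper) of how the two functions above might be driven over the full orbit is given below. Uav_ll and n are as in the waypoint sketch earlier, the uav, gimbal and cam structures are assumed to be pre-filled with the Table 2 values, A/info/lattiff/lontiff describe the Geo-TIFF, and pickTargetPixel is a hypothetical helper standing in for the user-defined selection of the target pixel.

% Hedged sketch: generate an image at every bearing waypoint and localize the target in it.
est_lat = zeros(1,n);  est_lon = zeros(1,n);
for j = 1:n
    uav.lat = Uav_ll(j,1);  uav.lon = Uav_ll(j,2);  uav.alt = 60;
    uav.phi = 0;  uav.theta = 0;  uav.psi = -j*(360/n);   % heading = -bearing (Fig. 3)
    frame = scenegenerate(uav, gimbal, cam, A, info, lattiff, lontiff);
    [x_ip, y_ip] = pickTargetPixel(frame.img);            % hypothetical target-pixel selection
    res = geolocation(x_ip, y_ip, cam, gimbal, uav);
    est_lat(j) = res.lat;  est_lon(j) = res.lon;
end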


Fig. 4  Circular waypoints considered in the simulation (target location and MAV locations in the circular path, plotted in latitude/longitude)

Error reduction techniques

The estimates of the geo-location coordinates calculated from images taken at different bearing angles are bound to be error prone even when ideal conditions are considered in the simulation. The error in a simulation such as this one is mainly due to inaccurate localization of the pixel coordinates, which can arise from an incorrect selection of the pixel coordinate, from the target not being located exactly at an integer pixel element, or from both. Moreover, when a practical real-time experiment is considered, the estimation error can increase further due to the uncertain parametric environment created by different error biases [8]. Hence, for reducing the error observed in the estimation of the geo-location coordinates, localization methods based on multiple bearing images can be considered. Averaging is one such method, where the results from multiple bearing images are combined to reduce the localization error. The averaging methods used in this paper are described below.

Average estimation

The (true) average estimate is determined by computing the arithmetic mean of the estimates, as shown in Eq. (10).

Utmx\_tavg = \frac{1}{n}\sum_{i=1}^{n} Utmx(i), \qquad Utmy\_tavg = \frac{1}{n}\sum_{i=1}^{n} Utmy(i)     (10)

where Utmx_tavg and Utmy_tavg are the true average values of the target x and y coordinates in UTM coordinates, n is the total number of samples (n = 36 in this study), and Utmx(i), Utmy(i) are the x and y UTM coordinate values computed for the ith sample.

Moving average estimation

The moving average estimate is the mean of a subset of the values, where only a few of the most recently determined x and y UTM coordinate values (from the most recent bearing samples) are considered; in this study the 5 most recent values are used. The moving average therefore uses only a subset of the values, as mentioned in Eq. (11), in contrast with the true average, which considers the entire dataset.

Utmx\_mavg(k) = \frac{1}{w}\sum_{i=k-w+1}^{k} Utmx(i), \qquad Utmy\_mavg(k) = \frac{1}{w}\sum_{i=k-w+1}^{k} Utmy(i)     (11)

where Utmx_mavg(k) and Utmy_mavg(k) are the moving average values determined for the kth sample number and w is the window size of the moving average.
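The sketch below (not from the paper) illustrates Eqs. (10) and (11) in Matlab, assuming Utmx and Utmy are 1-by-n vectors holding the raw x and y UTM estimates obtained from the n bearing images.

% Sketch of Eqs. (10)-(11): true average and 5-sample moving average of the raw estimates.
n = numel(Utmx);  w = 5;                          % window size used in this study
Utmx_tavg = mean(Utmx);  Utmy_tavg = mean(Utmy);  % Eq. (10): true average
Utmx_mavg = zeros(1,n);  Utmy_mavg = zeros(1,n);
for k = 1:n
    i0 = max(1, k-w+1);                           % truncated window for the first few samples
    Utmx_mavg(k) = mean(Utmx(i0:k));              % Eq. (11): moving average
    Utmy_mavg(k) = mean(Utmy(i0:k));
end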


Recursive least square estimation

Recursive least squares (RLS) is a simple method of recursively fitting a set of points to some function of choice by minimizing the sum of the squares of the offsets of the points. The result obtained using RLS is identical to the true average, but the process of obtaining it is more efficient. Another benefit of using RLS is that it provides intuition behind results such as those of the Kalman filter. The Matlab code snippet used for determining the estimate with the RLS algorithm is given below. Here the RLS quantities A_N, A_n1, b_xn, b_xn1, b_yn, b_yn1, P_n and P_n1 are determined in order to compute X_n1(j) and Y_n1(j), the x and y target coordinate values in the inertial frame for the jth bearing sample considered.

A_N=[];
for j=1:n
    nres=sprintf('FrameRes%d.mat',j*angle);
    nrest=strcat(pathname_tgl,'\',nres);
    estimate=load(nrest);

    %Camera Center Location w.r.t Inertial Co-Ordinates
    i_p_cc=estimate.Current_UavLocation.output_target.i_p_cc;
    %Image Depth
    lambda=estimate.Current_UavLocation.output_target.lambda;
    %Unscaled Object Target Location in Inertial Frame
    i_pbar_obj=estimate.Current_UavLocation.output_target.i_pbar_obj;

    %Compute a_n1,b_xn1,b_yn1 values
    a_n1=eye(1);
    b_xn1=i_p_cc(1)+(lambda.*(i_pbar_obj(1)-i_p_cc(1)));
    b_yn1=i_p_cc(2)+(lambda.*(i_pbar_obj(2)-i_p_cc(2)));

    % Determine A_N,b_xn,b_yn,P_n values
    if(isempty(A_N))
        A_N = [a_n1];
        b_xn = [b_xn1];
        b_yn = [b_yn1];
        P_n = inv((A_N)'*A_N);
        % Determine the X and Y UTM Co-Ordinate of Target
        X_n1(j) = P_n*(A_N)'*b_xn;
        Y_n1(j) = P_n*(A_N)'*b_yn;
    else
        % Determine A_N,b_xn,b_yn,P_n values
        P_n1 = P_n - ((P_n*(a_n1)'*(a_n1)*P_n)/(1 + (a_n1*P_n*(a_n1)')));
        A_n1 = [A_N' a_n1]';
        b_xn1 = [b_xn' b_xn1]';
        b_yn1 = [b_yn' b_yn1]';
        % Determine the X and Y UTM Co-Ordinate of Target
        X_n1(j) = P_n1*(A_n1'*b_xn1);
        Y_n1(j) = P_n1*(A_n1'*b_yn1);
        P_n = P_n1;
        A_N = A_n1;
        b_xn = b_xn1;
        b_yn = b_yn1;
    end
end
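One way to read the snippet above: because the design row a_n1 is the scalar 1, the recursion is effectively a recursive mean of the raw UTM estimates, so its final value matches the true average while avoiding a re-solution of the full least-squares problem at every bearing sample. A simplified innovation-form sketch of the same update (not from the paper; Utmx, Utmy and n as in the averaging sketch above) is:

% Simplified RLS sketch: recursive-mean form of the update with a unit design row.
X_rls = zeros(1,n);  Y_rls = zeros(1,n);
P = 1;  X_rls(1) = Utmx(1);  Y_rls(1) = Utmy(1);
for j = 2:n
    P = P - P^2/(1 + P);                               % P becomes 1/j
    X_rls(j) = X_rls(j-1) + P*(Utmx(j) - X_rls(j-1));  % update with the new x estimate
    Y_rls(j) = Y_rls(j-1) + P*(Utmy(j) - Y_rls(j-1));  % update with the new y estimate
end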
Table 1  UAV circular waypoint geo-location coordinates calculated using Haversine's formula

Target geo-location coordinates: Latitude 12.963021580681730, Longitude 77.651407302614828; UAV orbit radius: 50 m

Sample number (i)   Bearing angle (deg)   Latitude (deg)   Longitude (deg)
 1    10    12.96355298   77.65150345
 2    20    12.96352863   77.65159668
 3    30    12.96348888   77.65168416
 4    40    12.96343493   77.65176322
 5    50    12.96336842   77.65183147
 6    60    12.96329138   77.65188682
 7    70    12.96320613   77.65192761
 8    80    12.96311528   77.65195259
 9    90    12.96302158   77.65196101
10   100    12.96292788   77.65195259
11   110    12.96283703   77.65192761
12   120    12.96275178   77.65188682
13   130    12.96267474   77.65183146
14   140    12.96260823   77.65176322
15   150    12.96255428   77.65168415
16   160    12.96251453   77.65159668
17   170    12.96249019   77.65150345
18   180    12.96248199   77.6514073
19   190    12.96249019   77.65131115
20   200    12.96251453   77.65121793
21   210    12.96255428   77.65113045
22   220    12.96260823   77.65105139
23   230    12.96267474   77.65098314
24   240    12.96275178   77.65092778
25   250    12.96283703   77.65088699
26   260    12.96292788   77.65086201
27   270    12.96302158   77.6508536
28   280    12.96311528   77.65086201
29   290    12.96320613   77.65088699
30   300    12.96329138   77.65092778
31   310    12.96336842   77.65098314
32   320    12.96343493   77.65105139
33   330    12.96348888   77.65113045
34   340    12.96352863   77.65121792
35   350    12.96355298   77.65131115
36   360    12.96356117   77.6514073

Table 2  Simulation model parameters considered for image generation

Sample number: i = 1, 2, …, 36; bearing angle: τ = 360/n = 360/36 = 10°

UAV parameters                         Gimbal parameters              Camera parameters
Latitude: Uav_ll(i,1) (degrees)        Pan (α): 0                     w × h: 640 × 480 pixels
Longitude: Uav_ll(i,2) (degrees)       Tilt (β): 0                    fx: 142.8571428571 pixel units
Altitude: 60 m                         X_g^b: 0 m                     fy: 142.8571428571 pixel units
Heading (ψ): −(i·τ) degrees            Y_g^b: 0 m                     cx: 320.5 pixel
Roll (φ): 0                            Z_g^b: 0 m                     cy: 240.5 pixel
Pitch (θ): 0                           X_c^g, Y_c^g, Z_c^g: 0 m

Results and discussions

The simulation setup for the unmanned aerial vehicle (UAV) equipped with a camera orbiting around the target is realized using Matlab. For this study, the target is chosen to be the NAL overhead water tank, with geo-location coordinates of latitude 12.963021580681730 and longitude 77.651407302614828 as per the Google Earth geo-coordinate database. The simulated UAV model is made to orbit the target at an altitude of 60 m with an orbit radius of 50 m, so that the chosen target always remains within the camera field of view. Simulation images are generated for 36 equally spaced circular waypoints (Fig. 4). The UAV circular waypoint locations calculated using the Haversine formula for each bearing angle with respect to the target are listed in Table 1.


Fig. 5  UAV simulation images generated from different bearing angle locations (bearing angle τ: top left 10°, top right 90°, bottom left 180°, bottom right 270°)

Fig. 6  Estimation of the target geo-location coordinates from different bearing angles (target location, MAV locations in the circular path and estimates from each MAV location, plotted in latitude/longitude)

Fig. 7  Error in the estimation of the geo-location for each bearing sample number considered (estimation error in metres versus MAV sample number)


Fig. 8  Estimation of the geo-location with the different error reduction techniques (target location, MAV locations in the circular path, per-bearing estimates, true average estimate, moving average estimates and RLS estimates, plotted in latitude/longitude)

The simulation images are generated for each of the 36 equally spaced waypoints using the Matlab code described in the "Synthetic scene generation" section. The UAV, camera and gimbal parameters used for the simulation at each bearing angle τ are listed in Table 2. The synthetic images generated for some of the bearing angles are shown in Fig. 5.

The target geo-location is estimated for each of the images generated from the 36 equally spaced circular waypoints by applying the Geo-Location algorithm, using the user-defined target pixel coordinates and the simulation parameters involved (Fig. 6).
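The per-bearing errors plotted in Figs. 7 and 9 can be computed by comparing each estimate with the true target location in UTM coordinates. A hedged sketch is given below, assuming ll2utm (the lat/lon-to-UTM helper used in the snippets above) accepts vector inputs and that est_lat/est_lon are the per-bearing estimates collected as in the earlier usage sketch.

% Hedged sketch: per-bearing estimation error in metres.
[tx, ty] = ll2utm(12.963021580681730, 77.651407302614828);   % true target location (UTM)
[ex, ey] = ll2utm(est_lat, est_lon);                          % per-bearing estimates (UTM)
err_m = sqrt((ex - tx).^2 + (ey - ty).^2);                    % Euclidean error in metres
bar(1:numel(err_m), err_m);                                   % bar graph as in Fig. 7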


Fig. 9  Estimation error comparison of the different error reduction techniques (true value, true average, moving average and RLS value errors in metres versus sample number)

Errors with respect to the true target location are determined for the raw estimate from each bearing angle and plotted as a bar graph (Fig. 7). The average estimation, moving average estimation and RLS estimation techniques suggested in the "Error reduction techniques" section are then applied to reduce the error. The estimates obtained from the various techniques are plotted together with the target location (Fig. 8), and the errors of the different estimates are compared in a line graph (Fig. 9).

Conclusion

The Geo-Location algorithm is used for generating a synthetic scene image for each of the 36 equally spaced circular waypoints calculated using the Haversine formula. The target location is first estimated by considering each bearing image independently, and error reduction techniques are then applied by considering multiple bearing images. From the results obtained in the geo-location estimation with the various error reduction techniques, it is clear that the RLS technique is more reliable, with the estimation error constantly remaining under 0.8 m, whereas for the raw (true) estimates the peak error for a sample was found to be nearly 2.6 m. For future work, the geo-location algorithm and the error reduction techniques need to be practically implemented to test their efficiency in real time.

References

1. B.P. Tice, Unmanned aerial vehicles: the force multiplier of the 1990s. Airpower J. V(1) (1991). https://web.archive.org/web/20090724015052/, Accessed on 19 February 2016
2. P.L. Pratyusha, Estimation of ground target geo-location using UAV onboard camera. M.Tech thesis, Department of Avionics, IST, JNTU, Kakinada, July 2015
3. T.H. Summers, M.R. Akella, M.J. Mears, Coordinated standoff tracking of moving targets: control laws and information architectures. J. Guid. Control Dyn. 32(1), 56–69 (2009). http://arc.aiaa.org/doi/abs/10.2514/1.37212, Accessed on 18 February 2016
4. C. Veness, Calculate distance, bearing and more between latitude/longitude points. http://www.movable-type.co.uk/scripts/latlong.html, Accessed on 19 February 2016
5. R.W. Beard, T.W. McLain, Coordinate frames, in Small Unmanned Aircraft: Theory and Practice (Princeton University Press, Princeton, 2012), pp. 8–18
6. Y. Ma, S. Soatto, J. Kosecka, S.S. Sastry, An Invitation to 3-D Vision: From Images to Geometric Models (Springer, Berlin, 2004)
7. J.D. Redding, T.W. McLain, R.W. Beard, C.N. Taylor, Vision-based target localization from a fixed-wing miniature air vehicle, in Proceedings of the 2006 American Control Conference, Minneapolis, 2006, pp. 2862–2867
8. B. Barber, T.W. McLain, B. Edwards, Vision-based landing of a fixed-wing miniature air vehicle. J. Aerosp. Comput. Inf. Commun. (2009). doi:10.2514/1.36201, https://www.researchgate.net/publication/245439885, Accessed on 19 February 2016

