NAVAL POSTGRADUATE SCHOOL
MONTEREY, CALIFORNIA

THESIS

December 2008
The views expressed in this thesis are those of the author and do not reflect the official policy or position of the Department of Defense or the U.S. Government.

ABSTRACT
SeaFox is an unmanned surface vehicle (USV) used primarily for maritime security operations. Currently, a remotely operated, vision-based camera is used to track a particular target while the USV approaches it. While the USV is in motion, hydrodynamic forces and mechanical vibrations make it difficult for the operator to keep the camera locked on the target at all times.
This thesis addresses this issue through the development of a self-compensating motion controller that uses geo-pointing to track and lock onto a target at all times. Disturbance data captured by the onboard IMU sensor are used to establish parameters for the compensator. The compensated pan and tilt angles are fed to the vision-based camera through a PID controller.
The controller developed enables the vision-based camera system to autonomously track the intended target independently of the motion of the USV.
Subject terms: Unmanned Surface Vehicle, USV, Geo-pointing, Compensator, Vision-based camera, Autonomous tracking.
Approved for public release; distribution is unlimited.

Knox T. Millsaps
Chairman, Department of Mechanical and Astronautical Engineering
TABLE OF CONTENTS
I. INTRODUCTION........................................................................................................1
A. BACKGROUND ..............................................................................................1
B. PROBLEM FORMULATION .......................................................................2
C. OBJECTIVE ....................................................................................................3
D. SCOPE ..............................................................................................................3
II. SYSTEM ARCHITECTURE .....................................................................................5
A. SYSTEM OVERVIEW ...................................................................................5
B. COORDINATE REFERENCE FRAMES AND
TRANSFORMATIONS ..................................................................................6
1. Inertial Coordinate Reference Frame................................................6
2. Body Coordinate Reference Frame....................................................6
3. Gimbal Reference Frame ....................................................................6
4. Coordinate Transformations and Rotation Matrices.......................7
a. Euler Angles [3] ........................................................................7
b. Rotation Matrices......................................................................7
C. GEO-POINTING .............................................................................................8
D. PAN TILT CAMERA SYSTEM ..................................................................10
1. Camera Hardware Description ........................................................10
2. Camera Software and Communication Protocol ............................11
E. SENSORS .......................................................................................................12
1. IMU Sensor.........................................................................................12
2. GPS......................................................................................................12
III. SYSTEM MODEL .....................................................................................................13
A. OVERVIEW...................................................................................................13
B. GEO-POINTING MODEL ...........................................................................14
C. DISTURBANCE MODEL ............................................................................15
D. PAN TILT CAMERA MODEL....................................................................17
E. PAN TILT COMMAND MODEL ...............................................................18
IV. SIMULATIONS AND RESULTS ............................................................................21
A. EXPERIMENTAL SET UP ..........................................................................21
B. SIMULINK SIMULATIONS AND RESULTS ..........................................22
1. IMU Data Verification.......................................................................22
2. Target Tracking .................................................................................24
a. Target Tracking without Disturbance....................................25
b. Target Tracking with Disturbance .........................................25
V. CONCLUSIONS AND RECOMMENDATIONS...................................................29
A. CONCLUSIONS ............................................................................................29
B. RECOMMENDATIONS...............................................................................29
APPENDIX.............................................................................................................................31
A. IMU PARAMETERS ....................................................................................31
B. MATLAB FUNCTIONS ...............................................................................32
1. MATLAB function: Alticam_cmd.m ...............................................32
LIST OF REFERENCES ......................................................................................................33
INITIAL DISTRIBUTION LIST .........................................................................................35
ACKNOWLEDGMENTS
I would like to thank Professor Anthony Healey for his kind guidance and
immense contributions to this thesis. He has been a great source of inspiration and
knowledge.
I would also like to give special thanks to Sean Kragelund for his contributions to the software implementation, his Simulink model-building tips, and his help in constructing the bench-top setup that made this research work possible.
I. INTRODUCTION
A. BACKGROUND
The ScanEagle uses its electro-optical camera to locate the target vessel. The vessel's position on the surface of the earth is then computed. This point, known as the Sensor Point of Interest (SPOI), is relayed to the SeaFox in real time. Based on the SPOI coordinates, the SeaFox estimates the target vessel location and starts the pursuit. During the pursuit and subsequent tracking of the target vessel, the daylight camera on the SeaFox provides imaging to the operator via a wireless network. This is shown in Figure 2.
Figure 2. SeaFox pursuing the target vessel (panels a through d).
B. PROBLEM FORMULATION
The experiment served as a successful proof of concept: the SeaFox was able to track the target vessel. However, the experiment showed that there were two areas that required improvement and further study.
The first problem identified was the large variation in the SPOI data. The high noise in the data was attributed to the operating area and to the limitation imposed on the UAV's operating altitude. The noise in the SPOI data meant that small adjustments in the UAV's camera pointing direction translated into large target position changes on the surface.
The second problem identified was that the imaging provided by the camera onboard the SeaFox was unstable. This was mainly attributed to the wave motion to which the SeaFox was subjected as it pursued and tracked the target. The wave motion was transmitted to the camera, which resulted in unstable imaging for the operator.
From the above problems, it became clear that the SeaFox USV should not rely entirely on the target data provided by the UAV. It was desired that the USV use the initial target location provided by the UAV and thereafter pursue and track the target with its own sensors. Stable video imaging was also desired for proper target inspection, which meant that the camera system would require some form of compensation to cancel the wave effects that destabilize the video imaging.
This thesis addresses these areas, providing the USV with a means of tracking the target vessel from initial target coordinates and of canceling out the wave effects to obtain stable video imaging.
C. OBJECTIVE
The objective of this research is to design and develop a camera control system that can point the pan-tilt camera unit at the target in real time, given the geodetic location of the target, while taking into account the movements of the boat and the external disturbances from the water waves.
D. SCOPE
The scope of this thesis covers the computation of the pointing angles required to track the target and to cancel the wave motion in the camera.
II. SYSTEM ARCHITECTURE
A. SYSTEM OVERVIEW
The overall system is divided into two segments for ease of design. The first segment handles the IMU data that the system reads in real time. The IMU provides the accelerations of the USV body along three orthogonal axes as well as the rotational rates about these axes. The overall system architecture is shown in Figure 3. The rotational rates from the IMU are corrected for the rotation of the earth. These rates are then used to compute the Euler angles with reference to the USV body frame. The Euler angles computed reflect the effect of the waves directly on the body of the USV.
The second segment calculates the pointing angles required by the camera and sends the commands needed to pan and tilt the camera so as to obtain the proper view of the target. As shown in Figure 3, the camera pan and tilt commands are computed from the geodetic locations of the target and the USV. The latitude, longitude and altitude of both the target and the USV are converted to coordinates in the local tangent plane, which in this case is taken as the North-East-Down (NED) coordinate system. Once the coordinates are calculated in the NED frame, the relative vector of the target with respect to the USV is computed. This relative vector is then transformed into the USV body frame coordinate system. Knowing the relative vector of the target with respect to the USV, the command pan and tilt angles are computed for the camera to point at the target, taking into account the disturbances of the waves on the camera.
B. COORDINATE REFERENCE FRAMES AND TRANSFORMATIONS
1. Inertial Coordinate Reference Frame
The local tangent plane is used to represent the inertial coordinate system in this report. In using the local tangent plane, a necessary assumption of a flat earth has been made. This is a valid assumption, as the target and the USV operate within an area where the curvature of the earth does not come into play. North-East-Down (NED) is used as the right-handed orthogonal axis set representing the coordinate system in the local tangent plane. North corresponds to the x-axis, east corresponds to the y-axis, and down corresponds to the z-axis. The north axis aligns with the Northing of the earth, the east axis aligns with the Easting of the earth, and the down axis points toward the center of the earth.
2. Body Coordinate Reference Frame
The USV body coordinate frame is a right-handed orthogonal system with its origin at the center of gravity of the USV. The x-axis of the frame is aligned with the fore direction of the USV, the y-axis is aligned toward the starboard side of the USV, and the z-axis is aligned downward toward the center of the earth.
3. Gimbal Reference Frame
The gimbal reference frame has its origin at the intersection point of the pan axis and the tilt axis. The x-axis, y-axis and z-axis are aligned with the respective axes of the body frame. For simplification, the origin of the gimbal reference frame and the origin of the body frame are taken to be at the same location.
b. Rotation Matrices
The orientation of the USV in any frame is defined by its position, given by a [3x1] position vector, and its angular orientation, given by a [3x3] rotation matrix. The rotation matrices for rotation about each axis are given by the following equations [2]:

\[
\begin{bmatrix} x' \\ y' \\ z' \end{bmatrix}
= R_\psi \begin{bmatrix} x_i \\ y_i \\ z_i \end{bmatrix}
= \begin{bmatrix} \cos\psi & \sin\psi & 0 \\ -\sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x_i \\ y_i \\ z_i \end{bmatrix}
\tag{1}
\]

\[
\begin{bmatrix} x'' \\ y'' \\ z'' \end{bmatrix}
= R_\theta \begin{bmatrix} x' \\ y' \\ z' \end{bmatrix}
= \begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix}
\begin{bmatrix} x' \\ y' \\ z' \end{bmatrix}
\tag{2}
\]

\[
\begin{bmatrix} x_b \\ y_b \\ z_b \end{bmatrix}
= R_\phi \begin{bmatrix} x'' \\ y'' \\ z'' \end{bmatrix}
= \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & \sin\phi \\ 0 & -\sin\phi & \cos\phi \end{bmatrix}
\begin{bmatrix} x'' \\ y'' \\ z'' \end{bmatrix}
\tag{3}
\]

The above three equations are combined to form a single rotation matrix that transforms coordinates from the inertial frame to the body frame, as shown in the following equation [4]:

\[
R_I^B = R_\phi R_\theta R_\psi =
\begin{bmatrix}
\cos\theta\cos\psi & \cos\theta\sin\psi & -\sin\theta \\
\sin\phi\sin\theta\cos\psi - \cos\phi\sin\psi & \sin\phi\sin\theta\sin\psi + \cos\phi\cos\psi & \sin\phi\cos\theta \\
\cos\phi\sin\theta\cos\psi + \sin\phi\sin\psi & \cos\phi\sin\theta\sin\psi - \sin\phi\cos\psi & \cos\phi\cos\theta
\end{bmatrix}
\tag{4}
\]
Using this rotation matrix, the coordinates in the body frame are obtained as in the following equation:

\[
\begin{bmatrix} x_b \\ y_b \\ z_b \end{bmatrix} = R_I^B \begin{bmatrix} x_i \\ y_i \\ z_i \end{bmatrix}
\tag{5}
\]
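For reference, a minimal MATLAB sketch of Equations (1) through (5) is given below. It composes the three single-axis rotations into the inertial-to-body matrix and applies it to a vector; the function name and the radian-based interface are illustrative choices and not part of the thesis software.

function v_body = inertialToBody(v_inertial, phi, theta, psi)
%INERTIALTOBODY Rotate a vector from the inertial (NED) frame to the body frame.
%   v_inertial      : 3x1 vector expressed in the inertial frame
%   phi, theta, psi : roll, pitch, yaw Euler angles in radians

% Single-axis rotation matrices, Equations (1)-(3)
R_psi   = [ cos(psi)   sin(psi)  0;
           -sin(psi)   cos(psi)  0;
                   0          0  1];
R_theta = [ cos(theta) 0 -sin(theta);
                    0  1          0;
            sin(theta) 0  cos(theta)];
R_phi   = [1        0         0;
           0  cos(phi)  sin(phi);
           0 -sin(phi)  cos(phi)];

% Combined inertial-to-body rotation matrix, Equation (4)
R_IB = R_phi * R_theta * R_psi;

% Coordinate transformation, Equation (5)
v_body = R_IB * v_inertial;
end

As a quick check, inertialToBody([1;0;0], 0, 0, pi/2) returns approximately [0;-1;0]; that is, a north-pointing vector lies along the negative y (port) direction of a vehicle heading east.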
C. GEO-POINTING
In this report, computing the relative vector of the target vessel with respect to the USV is termed geo-pointing. Geo-pointing is carried out in the local tangent plane using NED coordinates. For geo-pointing to take place, the latitude and longitude of both the target and the USV must be known. In addition, the latitude and longitude of a point of origin, taken to be within the vicinity of both the target and the USV, must be known. Using these three points, the relative vector of the target with respect to the USV is obtained, as illustrated in Figure 4.
Figure 4. Relative vector of target with respect to USV in NED frame coordinates
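A minimal MATLAB sketch of this flat-earth geo-pointing computation is shown below. It converts the latitude, longitude and altitude of the USV and the target to local NED coordinates about a nearby origin and differences them. The function name, the spherical-earth radius, and the small-angle conversion are illustrative assumptions rather than the exact implementation used in the thesis.

function d_ned = relativeVectorNED(lat_t, lon_t, alt_t, lat_u, lon_u, alt_u, lat0, lon0)
%RELATIVEVECTORNED Relative vector of target w.r.t. USV in the local NED frame.
%   Latitudes/longitudes in degrees, altitudes in meters; (lat0, lon0) is an
%   origin chosen near the operating area (flat-earth assumption).

Re = 6371000;   % mean earth radius, m (spherical approximation)

% Convert a geodetic point to NED coordinates about the origin
ned = @(lat, lon, alt) [ deg2rad(lat - lat0) * Re; ...
                         deg2rad(lon - lon0) * Re * cosd(lat0); ...
                        -alt ];

target_ned = ned(lat_t, lon_t, alt_t);
usv_ned    = ned(lat_u, lon_u, alt_u);

% Relative vector of the target with respect to the USV (Figure 4)
d_ned = target_ned - usv_ned;
end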
However, the relative vector as obtained in the local tangent plane does not give the true angle to which the camera must point in order to track the target. The heading of the USV has to be taken into account so that the true pointing angle may be obtained. To do this, the relative vector of the target with respect to the USV must be expressed in the USV body frame. Figure 5 shows the true pan angle obtained using the relative vector in USV body frame coordinates.
Figure 5. Relative vector of target with respect to USV in body frame coordinates
D. PAN TILT CAMERA SYSTEM
1. Camera Hardware Description
The camera system used on the USV is the turret-mounted Alticam 400 [5]. This is a small, lightweight, turret-mounted camera system that is inertially stabilized, which allows the camera to point accurately at the target independently of the position and orientation of the USV. The camera incorporates advanced stabilization logic that is able to filter out the vibrations of the USV.
The E-640 electro-optic camera onboard uses a 640 x 480 pixel color CCD sensor. The camera is able to deliver video at 30 frames per second and has a 25x optical zoom. Figure 6 shows the Alticam camera system.
The Alticam camera system has a bandwidth of 20 Hz. This limits the rate at which commands can be sent to the camera system. Table 1 below shows the hardware communication settings for the camera system.

Type      SW version   bps      Selectable?   Data   Stop bit   Parity   HW control
AltiCam   04 195       57,600   NO            8      1          N        None
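The settings in Table 1 map directly onto a serial-port configuration. A minimal MATLAB sketch is shown below; the port name "COM1" is a placeholder, and the use of the serialport interface (rather than the RS-232 blockset actually used in the Simulink model) is an illustrative assumption.

% Open a serial connection to the Alticam 400 using the Table 1 settings:
% 57,600 bps, 8 data bits, 1 stop bit, no parity, no hardware flow control.
cam = serialport("COM1", 57600, ...
    "DataBits", 8, ...
    "StopBits", 1, ...
    "Parity", "none", ...
    "FlowControl", "none");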
2. Camera Software and Communication Protocol
The Alticam camera system uses the Insitu Seascan protocol for communication. Messages to and from the camera system are composed of a header field, a data field, and a CRC.
The header field, which contains 9 bytes, is necessary because it allows the Alticam camera system to synchronize with the incoming message. The header fields are always the same for messages being sent to the camera system. Table 2 shows an example of the header used in this thesis; the values are in hexadecimal.
H0 H1 H2 H3 H4 H5 H6 H7 H8
0x55 0xAA 0x07 0x4D 0x00 0x00 0x00 0x00 0x00
The data field for the camera system, made up of 7 bytes, contains the commands sent to the camera for it to perform a particular function. In this thesis, the commands sent to the camera are the pan angle command and the tilt angle command. The syntax and the required protocol are shown in Table 3 below.
D0 D1 D2 D3 D4 D5 D6
47 51 00 op 00 00 uv
The values op and uv are obtained from the commanded angles as follows:

\[
op = \frac{10000}{256}\,\mathrm{cmd} + 128
\tag{7}
\]

\[
uv = \frac{10000}{256}\,\mathrm{cmd} + 128
\tag{8}
\]

where cmd denotes the commanded angle.
The values evaluated by the equations above must then be converted to hexadecimal before being sent to the camera system. The syntax for other commands can be found in [5].
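A minimal MATLAB sketch of assembling a pan/tilt command message from Tables 2 and 3 and Equations (7) and (8) is given below. The mapping of op to the pan command and uv to the tilt command, the assumption that the commanded angles are in radians (so the scaled values fit in a single byte), and the omission of the trailing CRC bytes are all illustrative assumptions; the exact Seascan framing should be taken from [5].

function msg = buildPanTiltMessage(pan_cmd, tilt_cmd)
%BUILDPANTILTMESSAGE Assemble a Seascan-style pan/tilt message (CRC omitted).
%   pan_cmd, tilt_cmd : commanded pan and tilt angles (assumed radians)

% Header bytes from Table 2: 0x55 0xAA 0x07 0x4D 0x00 0x00 0x00 0x00 0x00
header = uint8([85 170 7 77 0 0 0 0 0]);

% Scaled command bytes, Equations (7) and (8)
op = round(pan_cmd  * 10000/256 + 128);   % assumed: op carries the pan command
uv = round(tilt_cmd * 10000/256 + 128);   % assumed: uv carries the tilt command

% Data field from Table 3: 0x47 0x51 0x00 op 0x00 0x00 uv
data = uint8([71 81 0 op 0 0 uv]);

msg = [header data];   % CRC bytes (see [5]) would be appended here before sending
end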
E. SENSORS
1. IMU Sensor
The Inertial Measurement Unit (IMU) used in this thesis is the Honeywell HG1700, a low-cost IMU based on ring laser gyroscopes (Figure 7).
The IMU outputs the rotational rates p, q and r, measured with respect to the inertial frame, describing the rotation of the body about its orthogonal x, y and z axes. The IMU also outputs the accelerations along the three orthogonal axes, Ax, Ay and Az. The IMU is mounted at the center of gravity of the USV so that it is able to measure the rotation of the USV about its center of gravity. A detailed IMU parameter list is attached in Appendix A.
2. GPS
The Global Positioning System (GPS) receiver on board the SeaFox USV provides the latitude and longitude required for the calculation of the camera pointing angles.
III. SYSTEM MODEL
A. OVERVIEW
The control system developed in this thesis was implemented in Simulink. Figure 8 shows the Simulink model of the overall system.
The geo-pointing block computes the relative vector of the target with respect to the USV in the inertial frame, based on the latitude, longitude and altitude of the USV and the target. The output of this block is passed on to the pan tilt command block.
The disturbance model computes the disturbances imposed on the USV by the waves. This model reads the IMU data, which measure the disturbances on the USV. The output of this block, the Euler angles in the body frame, is passed on to the pan tilt command block.
The pan tilt command block computes the pan and tilt angles required to track the target, taking into account the disturbances on the USV. The outputs of this block, the pan command angle and the tilt command angle, are passed on to the pan tilt camera model.
The pan tilt camera model takes the pan and tilt command angles and converts them into a message for communication with the camera system.
To allow the model to read data in real time, the RTBlock is used to ensure that the simulation runs in real time. The simulation is run with a step size of 0.01 s.
B. GEO-POINTING MODEL
The geo-pointing model computes the relative vector of the target with respect to the USV. Figure 9 shows the structure of the geo-pointing model.
Figure 9. Simulink model to compute relative vector of target to USV in inertial frame
The latitude, longitude and altitude are used to compute the NED coordinates of the USV and the target respectively. This model requires a point of origin, which is taken to be within the vicinity of the operation so as to reduce localization errors. The difference between the target NED coordinates and the USV NED coordinates produces the required relative vector in the inertial frame.
C. DISTURBANCE MODEL
The disturbance model measures the disturbances on the USV, and thus on the camera platform, using the onboard IMU. The Simulink model of the disturbance model is shown in Figure 10.
The disturbance measurement data from the IMU are read by an RS-232 blockset. The data arrive in binary form via an RS-232 port and are then converted to decimal values. Because of vibrations the signal is noisy, so a third-order Butterworth low-pass filter is used to produce a smoother signal. The data acquisition model is shown in Figure 11.
Figure 11. IMU data acquisition Simulink model
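As an offline illustration of this smoothing stage, the sketch below designs a third-order Butterworth low-pass filter in MATLAB (Signal Processing Toolbox) and applies it to a raw rate signal sampled at 100 Hz. The 5 Hz cutoff and the synthetic input are illustrative assumptions; the thesis does not state the cutoff used in the Simulink block.

% Third-order low-pass Butterworth filter for the raw IMU signals.
fs = 100;                        % IMU sampling rate, Hz
fc = 5;                          % cutoff frequency, Hz (illustrative assumption)
[b, a] = butter(3, fc/(fs/2));   % normalized cutoff = fc / (fs/2)

% Example: smooth a noisy roll-rate record p_raw (rad/s)
t      = (0:1/fs:10)';
p_raw  = 0.2*sin(2*pi*0.5*t) + 0.05*randn(size(t));   % synthetic signal plus noise
p_filt = filter(b, a, p_raw);    % causal filtering, as a real-time block would apply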
The acceleration values Ax and Ay from the IMU are used to compute the initial gravity-based pitch angle, θ_grav, and roll angle, φ_grav. These angles are computed as shown in the following equations [3]:

\[
\theta_{grav} = \sin^{-1}\!\left(\frac{A_x}{g}\right)
\tag{9}
\]

\[
\phi_{grav} = \sin^{-1}\!\left(\frac{A_y}{g\cos\theta_{grav}}\right)
\tag{10}
\]
The φ_grav and θ_grav angles are then used to compute the earth-rate correction for the rotational rates p, q and r. The corrected rotational rates, together with the φ_grav and θ_grav angles, are used to compute the Euler angle rates as shown in Equation (11) [2]:

\[
\begin{bmatrix} \dot{\phi} \\ \dot{\theta} \\ \dot{\psi} \end{bmatrix}
=
\begin{bmatrix}
1 & \sin\phi\,\dfrac{\sin\theta}{\cos\theta} & \cos\phi\,\dfrac{\sin\theta}{\cos\theta} \\
0 & \cos\phi & -\sin\phi \\
0 & \sin\phi\,\dfrac{1}{\cos\theta} & \cos\phi\,\dfrac{1}{\cos\theta}
\end{bmatrix}
\begin{bmatrix} p \\ q \\ r \end{bmatrix}
\tag{11}
\]
Once the Euler angle rates have been determined, the Euler angles are obtained by integrating them, taking the φ_grav and θ_grav angles as initial values for the integration. The model implementing this computation is shown in Figure 12.
Figure 12. Simulink model to compute Euler Angles
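A condensed MATLAB sketch of one pass through this disturbance computation is given below: gravity-based leveling per Equations (9) and (10), the Euler-rate transformation of Equation (11), and a simple forward-Euler integration step. The body rates are assumed to be already earth-rate corrected, the numerical values are examples only, and the assignment of Equation (9) to pitch and Equation (10) to roll follows the standard accelerometer-leveling relations; these are assumptions where the thesis text is ambiguous.

% Initial gravity-based leveling from accelerometer readings Ax, Ay (m/s^2),
% Equations (9) and (10); example values shown.
g  = 9.81;
Ax = 0.5;  Ay = -0.3;                        % example accelerometer readings
theta = asin(Ax / g);                        % pitch angle (assumed assignment of Eq. 9)
phi   = asin(Ay / (g*cos(theta)));           % roll angle  (assumed assignment of Eq. 10)
psi   = 0;                                   % initial yaw

% One integration step of Equation (11) with earth-rate-corrected body rates p, q, r
p = 0.02; q = -0.01; r = 0.05;  dt = 0.01;   % example corrected rates (rad/s), 100 Hz step
T = [1  sin(phi)*tan(theta)  cos(phi)*tan(theta);
     0  cos(phi)            -sin(phi);
     0  sin(phi)/cos(theta)  cos(phi)/cos(theta)];
euler_rates = T * [p; q; r];                 % [phi_dot; theta_dot; psi_dot]
phi   = phi   + euler_rates(1)*dt;           % forward-Euler integration,
theta = theta + euler_rates(2)*dt;           % initialized at the gravity-based angles
psi   = psi   + euler_rates(3)*dt;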
D. PAN TILT CAMERA MODEL
The pan tilt camera model computes the pan and tilt angles required to point the camera at the target, taking into account the disturbances on the USV and the camera platform. Figure 13 shows the Simulink model.
Using the Euler angles from the disturbance model, a [3x3] rotation matrix is obtained, as shown in Figure 14.
Figure 14. Simulink model of transformation matrix
This matrix is then multiplied by the output of the geo-pointing model to obtain the relative vector of the target with respect to the USV in body frame coordinates, as shown in the following equation:

\[
\begin{bmatrix} dx \\ dy \\ dz \end{bmatrix} = R_{\phi,\theta,\psi} \begin{bmatrix} dN \\ dE \\ dD \end{bmatrix}
\tag{12}
\]

The rotation matrix \(R_{\phi,\theta,\psi}\) is given by Equation (4). The relative vector obtained in Equation (12) gives the position of the target with respect to the USV without having to account for the heading of the USV separately. The pan angle, θ_pan, and tilt angle, θ_tilt, required by the camera platform are given by the following equations:

\[
\theta_{pan} = \tan^{-1}\!\left(\frac{dy}{dx}\right)
\tag{13}
\]

\[
\theta_{tilt} = \tan^{-1}\!\left(\frac{dD}{\sqrt{(dx)^2 + (dy)^2}}\right)
\tag{14}
\]
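A minimal MATLAB sketch of Equations (12) through (14) follows: the relative NED vector from the geo-pointing model is rotated into the body frame with the matrix of Equation (4), and the pan and tilt angles are computed with four-quadrant arctangents. Taking the rotation matrix as an input argument, and using the body-frame down component in Equation (14), are simplifying assumptions made for illustration.

function [pan, tilt] = panTiltCommands(d_ned, R_IB)
%PANTILTCOMMANDS Pan and tilt angles (rad) to point the camera at the target.
%   d_ned : [dN; dE; dD] relative vector of target w.r.t. USV in the NED frame
%   R_IB  : 3x3 inertial-to-body rotation matrix of Equation (4)

d_body = R_IB * d_ned;                                    % Equation (12): [dx; dy; dz]

pan  = atan2(d_body(2), d_body(1));                       % Equation (13)
tilt = atan2(d_body(3), hypot(d_body(1), d_body(2)));     % Equation (14)
end

For example, with R_IB = eye(3) and d_ned = [100; 100; 0], the sketch returns a pan angle of pi/4 rad (45 degrees) and zero tilt; that is, the camera points northeast at the horizon.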
E. PAN TILT COMMAND MODEL
The pan tilt command model takes the command pan and tilt angles and places them in the message sent to the camera, as shown in Figure 15.
Figure 15. Simulink model to command Alticam camera
The data values for the message are sent to the Alticam_cmd.m MATLAB function. This function forms the message in the required communication protocol and sends it out to the camera. The Alticam_cmd function is attached in Appendix B.
IV. SIMULATIONS AND RESULTS
A. EXPERIMENTAL SET UP
The platform that was constructed is able to roll, pitch and yaw about the x, y and z axes respectively, as shown in Figure 16. The output from the IMU is fed to the Simulink model on a computer via an RS-232 port. The IMU outputs a stream of data; for this thesis, the data of interest are the gyro rotation rates p, q and r, as well as the accelerometer values Ax, Ay and Az. The Simulink model is set up to read the incoming data at a sampling rate of 100 Hz, which is the maximum data output rate of the IMU.
B. SIMULINK SIMULATIONS AND RESULTS
This section describes the various simulations that were run to test the control system; the results of each simulation are reviewed and analyzed.
1. IMU Data Verification
The main aim of analyzing the IMU data was to obtain the body frame rotational Euler angles, which can then be used as a disturbance signal to correct the pan and tilt angle commands to the camera. The data from the IMU are referenced to the inertial frame. Therefore, to be able to use the IMU data, they must be further processed to obtain the rotational angles of the USV, and thus of the camera platform, in the body reference frame. These angles, known as the integrated Euler angles, are used in the subsequent simulations. However, in order to validate the data received from the IMU, it was desired to compare the gravity-based roll and pitch angles with the integrated Euler angles.
The experiment to verify the IMU data was done in three steps. First, the platform was rotated about the x-axis to simulate roll on the USV. The IMU data were then used to plot the roll angle against time for both the integrated roll angle and the gravity-based roll angle; this comparison is shown in Figure 17. Second, the platform was rotated about the y-axis to simulate pitch on the USV. The resulting integrated pitch angle and gravity-based pitch angle are plotted in Figure 18. Finally, the platform was rotated about the z-axis to simulate yaw of the USV. In this case, only the integrated yaw angle is obtained; it is plotted in Figure 19.
Figure 17. Graph of roll angle against time
From these graphs, it can be seen that the gravity-based Euler angles and the integrated Euler angles agree closely. It can therefore be concluded that the integrated Euler angles are sufficiently accurate for use in the model.
2. Target Tracking
The next step is to simulate the ability of the USV to track a target given its geodetic coordinates. For this experiment, the USV and the target are assumed to be located at arbitrary points in Monterey Bay. Table 4 shows the coordinates of the USV and of the target vessel, and the layout of the scenario is shown in Figure 20.
The USV is assumed to travel due north from its starting position to the end point shown in Figure 20. For this simulation, the camera onboard the USV tracks the target under two different conditions: without disturbance and with disturbance.
a. Target Tracking without Disturbance
In this scenario, the USV moves toward the target without taking into account the disturbances due to the waves on the USV and the camera platform. This is simulated by holding the platform level while the simulation runs. The results of the simulation, showing the pan angle command and tilt angle command plotted against time, are given in Figure 21.
Figure 21. Graph of pan/tilt angles against time for tracking without disturbance
From the graph, it can be seen that the pan angle increases steadily as the USV approaches the target vessel. The tilt angle shows very little change, reflecting the unchanged relative height between the target and the USV.
b. Target Tracking with Disturbance
In this scenario, the USV moves toward the target while taking into account the disturbances due to the waves on the USV and the camera platform. This is simulated by rotating the platform about each axis in turn so as to simulate roll, pitch and yaw on the USV during the simulation run. The results of the simulation, showing the pan angle command and tilt angle command plotted against time for each of the disturbances, are given in Figure 22, Figure 23 and Figure 24.
Figure 22. Graph of pan/tilt angles against time for tracking with roll disturbance
Figure 23. Graph of pan/tilt angles against time for tracking with pitch disturbance
Figure 24. Graph of pan/tilt angles against time for tracking with yaw disturbance
From the graphs above, it can be seen that roll and pitch motions of the USV cause only slight changes in the pan angle, while the corresponding changes in the tilt angle are more pronounced. In the third case, where there is yaw motion on the USV, there are substantial changes in the pan angle of the camera and only a slight change in the tilt angle.
V. CONCLUSIONS AND RECOMMENDATIONS
A. CONCLUSIONS
A control system for tracking a target vessel with the onboard camera was developed in this thesis. The control system was required to enable the camera to track the target vessel while taking into account the disturbances imposed on the USV and the camera platform by the waves. Using the geodetic coordinates of the target, the camera onboard the USV was able to point at and track the target. The simulations also showed that the control system enabled the camera to track the target even with disturbances acting on the USV.
To complete the thesis, a hardware-in-the-loop system was implemented by incorporating the EO camera into the system. Using feed-forward control, the simulation was re-run, and the results showed that the control system behaved as required. The control system was able to point the camera at, and track, a single point regardless of the motion of the platform on which it was mounted.
B. RECOMMENDATIONS
This thesis began the work of incorporating a target tracking system on the SeaFox USV. Owing to the unavailability of the actual SeaFox platform, a Simulink model was built and tested on a bench-top experimental setup.
The Alticam camera system is also capable of scanning, which gives the camera additional coverage. It is recommended that this scan capability be implemented in future studies.
APPENDIX
A. IMU PARAMETERS
B. MATLAB FUNCTIONS
1. MATLAB function: Alticam_cmd.m

%% This function converts the incoming pan and tilt command angles into the data
%% message, adds the header to the message, and sends the final message with the
%% CRC to the camera.
function messageD=alticamTest(u)
u=uint8(u);
header = ['55';'aa';'07';'4d';'01';'00';'00';'98';'00'];%Header message
LIST OF REFERENCES
[1] S. Kragelund, NPS After Action Report, Center for Unmanned Vehicle Systems, Naval Postgraduate School, Monterey, CA, April 2008.
[4] R. A. Prince, "Autonomous visual tracking of stationary targets using small unmanned aerial vehicles," M.S. thesis, Naval Postgraduate School, Monterey, CA, 2004.
[6] K. Ogata, Modern Control Engineering, 4th ed. New Jersey: Prentice Hall, 2002.
INITIAL DISTRIBUTION LIST
1. Anthony J. Healey
Naval Postgraduate School
Monterey, California
2. Sean Kragelund
Naval Postgraduate School
Monterey, California