Review
Autonomous Underwater Vehicles: Localization,
Navigation, and Communication for
Collaborative Missions
Josué González-García 1 , Alfonso Gómez-Espinosa 1, * , Enrique Cuan-Urquizo 1 ,
Luis Govinda García-Valdovinos 2, *, Tomás Salgado-Jiménez 2 and
Jesús Arturo Escobedo Cabello 1
1 Tecnologico de Monterrey, Escuela de Ingeniería y Ciencias, Av. Epigmenio González 500, Fracc. San Pablo,
Querétaro 76130, Mexico; [email protected] (J.G.-G.); [email protected] (E.C.-U.);
[email protected] (J.A.E.C.)
2 Energy Division, Center for Engineering and Industrial Development-CIDESI, Santiago de Queretaro,
Queretaro 76125, Mexico; [email protected]
* Correspondence: [email protected] (A.G.-E.); [email protected] (L.G.G.-V.);
Tel.: +52-442-238-3302 (A.G.-E.)

Received: 6 January 2020; Accepted: 6 February 2020; Published: 13 February 2020 

Abstract: Development of Autonomous Underwater Vehicles (AUVs) has permitted the automation of many tasks originally achieved with manned vehicles in underwater environments.
Teams of AUVs designed to work within a common mission are opening the possibilities for
new and more complex applications. In underwater environments, communication, localization,
and navigation of AUVs are considered challenges due to the impossibility of relying on radio
communications and global positioning systems. For a long time, acoustic systems have been the
main approach for solving these challenges. However, they present their own shortcomings, which
are more relevant for AUV teams. As a result, researchers have explored different alternatives. To
summarize and analyze these alternatives, a review of the literature is presented in this paper. Finally,
a summary of collaborative AUV teams and missions is also included, with the aim of analyzing their
applicability, advantages, and limitations.

Keywords: Autonomous Underwater Vehicle(s); collaborative AUVs; underwater localization

1. Introduction
Over the years, a large number of AUVs have been designed to accomplish a wide range of applications in the scientific, commercial, and military areas. For oceanographic studies, AUVs have become very popular for exploring, collecting data, and creating 3D reconstructions or maps [1,2]. In the oil and gas industry, AUVs inspect and repair submerged infrastructure, and they also have great potential in search, recognition, and localization tasks such as airplane black-box recovery missions [3,4]. AUVs are also used for port and harbor security tasks such as environmental inspection, surveillance, detection and disposal of explosives, and minehunting [5,6].
Design, construction, and control of AUVs represent challenging work for engineers, who must face constraints they do not encounter in other environments. Above water, most autonomous systems rely on radio or spread-spectrum communications along with global positioning. In underwater environments, AUVs must rely on acoustic-based sensors and communication. The design and implementation of new technologies and algorithms for navigation and localization of AUVs—especially for collaborative work—is therefore a great research opportunity.

Appl. Sci. 2020, 10, 1256; doi:10.3390/app10041256 www.mdpi.com/journal/applsci



Before establishing a collaborative scheme for AUVs, the problem of localization and navigation must be addressed for every vehicle in the team. Traditional methods include Dead-Reckoning (DR) and Inertial Navigation Systems (INS) [7]. DR and INS are some of the earliest established methods to locate an AUV [8]. These systems rely on measurements of the water-speed and the vehicle's velocities and accelerations that, upon integration, lead to the AUV position. They are suitable for long-range missions and have the advantage of being passive methods—they need neither to send nor to receive signals from external systems—resulting in a solution immune to interference. Nevertheless, their main problem is that the position error grows over time—commonly known as accuracy drift—as a result of factors such as ocean currents and the accuracy of the sensors themselves, which are not capable of sensing the displacements produced by external forces or the effects of earth's gravity. The use of geophysical maps to match the sensor measurements is an alternative to deal with the accuracy drift of inertial systems. This method is known as Geophysical Navigation (GN) [9] and allows longer missions to be accomplished while keeping the position error relatively low. However, the geophysical maps must be available before the mission, which is one of the main disadvantages of GN, along with the computational cost of comparing and matching the map with the sensor data. Acoustic ranging systems have been another common alternative for AUV navigation [10]. These systems can be implemented using acoustic transponders to locate an AUV in either global or relative coordinates. However, most of them require complex infrastructure, and the cost of such deployments can be higher compared with other methods. In recent years, researchers have been exploring new alternatives for AUV localization and navigation. Optical technologies have become very popular for robots and vehicles in land or air environments [11], but they face tough conditions in underwater environments that have delayed the development of such technologies for AUVs. When the underwater conditions permit proper light propagation and detection, visual-based systems can significantly improve the accuracy of the position estimates and reach higher data rates than acoustic systems. Finally, recent advancements in sensor fusion schemes and algorithms are contributing to the development of hybrid navigation systems, which take advantage of different solutions to overcome their weaknesses. A sensor fusion module improves the AUV state estimation by processing and merging the available sensor data [12]. Some of the common sensors used for this are the inertial sensors of an INS, Doppler Velocity Loggers (DVL), and depth sensors. Recently, INS measurements are also being integrated with acoustic/vision-based systems to produce a solution that, besides reducing the accuracy drift of the INS, has high positioning accuracy at short ranges. All these technologies are addressed in Section 2 of this work, which is organized as shown in Figure 1, including the main sensors used and the different approaches taken.
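To make the idea of such a sensor fusion module concrete, the following minimal sketch—illustrative only, not taken from the reviewed works, with invented variable names and noise figures—shows a one-dimensional Kalman-style update in which a velocity prediction propagated from INS accelerations is corrected with a DVL velocity measurement:

```python
def fuse_velocity(v_ins, var_ins, v_dvl, var_dvl):
    """Fuse an INS-propagated velocity with a DVL measurement (1-D Kalman update).

    v_ins, var_ins : predicted velocity and its variance from integrating INS accelerations
    v_dvl, var_dvl : DVL velocity measurement and its variance
    Returns the fused velocity and its (reduced) variance.
    """
    k = var_ins / (var_ins + var_dvl)          # Kalman gain
    v_fused = v_ins + k * (v_dvl - v_ins)      # correct the prediction with the measurement
    var_fused = (1.0 - k) * var_ins            # fused estimate is more certain than either input
    return v_fused, var_fused

# Hypothetical numbers: the INS-propagated surge velocity has drifted, the DVL is noisy but unbiased.
v, var = fuse_velocity(v_ins=1.30, var_ins=0.04, v_dvl=1.18, var_dvl=0.01)
print(f"fused velocity = {v:.3f} m/s, variance = {var:.4f}")
```

The same update, extended to a full state vector, is what the hybrid navigation systems discussed later apply at every measurement step.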
After solving the problem of self-localization and navigation, other challenges must be addressed to implement a collaborative team of AUVs. Since there is a need for sharing information between the vehicles, communication is an important concern. The amount and size of the messages will depend on the collaborative scheme used, the number of vehicles, and the capabilities of the communication system. Acoustic-based communication performs better than light-based communication in terms of range, but not in data rate. It also suffers from many other shortcomings such as small bandwidth, high latency, and unreliability [13]. Despite its notable merits in the terrestrial wireless network field, radio-based communication has had very few practical underwater applications to date [14]. The collaborative navigation scheme is also a mandatory issue to be considered. The underwater environment is complex to navigate in by itself, and now multiple vehicles have to navigate among each other. A proper formation has to ensure safe navigation for every single vehicle. These topics are analyzed in Section 3, which also includes a review of the main collaborative AUV missions: surveillance and intervention. Since there is no need to interact with the environment, survey missions are simpler to implement and have been performed successfully for different applications, such as mapping or object searching and tracking. Intervention missions are usually harder due to the complexity of the manipulators or actuators needed. In either case, since an experimental setup is difficult to achieve, many of the efforts made are being tested in simulation environments.

Figure 1. Autonomous Underwater Vehicle (AUV) technologies for localization and navigation.

2. Navigation and Localization


Navigation and localization are two of the most important challenges for underwater robotics [11]. They remain open issues for many applications such as collaborative missions. DR and INS are traditional methods for AUV localization and navigation but suffer from a decrease in pose accuracy over time. In addition to these traditional technologies, this problem has been addressed in the past with acoustic technologies such as Long Baseline (LBL) [15–17], Short Baseline (SBL) [18,19], or Ultra-Short Baseline (USBL) [20–23] systems. Acoustic positioning systems, though, require careful calibration of the sound velocity, as they suffer from multipath, Doppler effects, and susceptibility to thermoclines; they also have a limited range and accuracy [24]. Geophysical Navigation (GN) is also a solution for vehicle localization. Matching algorithms such as TERrain COntour-Matching (TERCOM) and Sandia Inertial Terrain Aided Navigation (SITAN) are relatively mature; however, new algorithms are currently being proposed [25]. The main limitation of GN systems is the need for a geophysical map against which to compare the measurements from the sensors. Visual-based systems have been a trend for vehicle navigation in land and air environments, but there are several problems related to light propagation and detection in underwater environments. Additionally, most visual-based methods for autonomous navigation depend on the presence of features in the images taken, which, even if they exist, are difficult to extract due to the limited illumination conditions. In recent years, the field of AUV localization has been shifting from old technologies towards more dynamic approaches that require less infrastructure and offer better performance [13]. This section presents a survey of single-vehicle localization and navigation technologies—including different methods, sensors, and approaches—in the understanding that those can be applied in multi-vehicle collaborative navigation schemes.

2.1. Dead-Reckoning and Inertial Navigation


The simplest method to obtain a position for a moving vehicle is by integrating its velocity in time. This method is known as dead-reckoning [8]. DR requires knowledge of the velocity and direction of the vehicle, which is usually obtained with a compass and a water speed sensor. The principal problem is related to the presence of an ocean current, as illustrated in Figure 2, because it will add a velocity component to the vehicle which is not detected by the speed sensor. The accuracy of the method will then be strongly affected, especially when the vehicle navigates at a low velocity.
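As a hedged illustration of this effect (not taken from the paper; the speed, heading, and current values are invented), the following sketch integrates water-relative velocity the way a DR system would and compares the result against the true ground track of a vehicle pushed by a constant, unmeasured current:

```python
import math

def dead_reckon(speed_through_water, heading_deg, current_xy, duration_s, dt=1.0):
    """Integrate the water-relative velocity (DR estimate) and the true velocity
    (water-relative velocity plus an unmeasured ocean current)."""
    est_x = est_y = true_x = true_y = 0.0
    vx = speed_through_water * math.cos(math.radians(heading_deg))
    vy = speed_through_water * math.sin(math.radians(heading_deg))
    cx, cy = current_xy
    for _ in range(int(duration_s / dt)):
        est_x += vx * dt              # what the DR system believes
        est_y += vy * dt
        true_x += (vx + cx) * dt      # what actually happens to the vehicle
        true_y += (vy + cy) * dt
    drift = math.hypot(true_x - est_x, true_y - est_y)
    return (est_x, est_y), (true_x, true_y), drift

# Hypothetical scenario: 1 m/s through the water, 0.2 m/s cross-track current, 1 h mission.
_, _, drift = dead_reckon(1.0, 0.0, (0.0, 0.2), 3600.0)
print(f"position error after one hour: {drift:.0f} m")   # grows linearly with time
```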
Figure 2. Dead-Reckoning drift effect.

Inertial sensors can be used to improve the navigation accuracy and reliability of DR methods. An INS consists of three mutually-orthogonal accelerometers aligned to a gyroscopic reference frame. Measured accelerations are integrated to obtain the desired velocity, position, and attitude information of the vehicle. The fact that inertial navigation is self-contained—it neither emits nor receives any external signal—is one of its most significant strengths, making it a stealthy navigation solution, immune to interference or jamming [26]. However, the error in the pose estimates is known to increase over time and depends on the accuracy of the sensors used. Mathematically, the total acceleration, denoted as r̈, can be expressed as follows [27]:

r̈ = a + g, (1)

where a is the acceleration calculated by the INS and g is the gravitational acceleration. Since the accelerometers do not sense gravity, the position of the vehicle obtained by integrating the acceleration measurements will contain an error. Gyroscopic drifts are also a source of error that can result in significant misalignments between the sensor frame and the earth-fixed reference frame, causing navigation errors that also grow over time. Using a Global Positioning System (GPS) is a common method to correct these errors. However, to correct the error accumulated by the INS, the vehicle must go to the surface to obtain a new GPS location at regular intervals, which can result in a waste of time and resources. The integration of INS and GPS data can also be a complex process, since those systems are based on completely different principles.

Even if the instruments were perfect, the estimates of an INS would still contain an error [9]. The gyroscopic reference frame is aligned to a reference ellipsoid model of the earth. The reference ellipsoid conforms roughly to the shape of the earth, and in particular to mean sea level. If the mass of the earth were homogeneously distributed within the ellipsoid, the gravity vector would be normal to the reference ellipsoid surface. However, due to the inhomogeneous distribution of the earth's mass, the gravity vector can have significant components tangential to the reference ellipsoid surface (known as vertical deflections), as shown in Figure 3. Since an INS cannot distinguish between the tangential components of earth gravity and the horizontal acceleration of the vehicle, these gravity disturbances cause errors in the INS velocity and position estimations.

Figure 3. Vertical deflection.
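To make Equation (1) and the gravity-compensation issue more tangible, the sketch below is an illustrative simplification (not the authors' implementation; the gravity-model error value is invented). It double-integrates accelerometer readings after removing a modeled gravity value, showing how any unmodeled gravity component is indistinguishable from vehicle acceleration and accumulates quadratically in the position estimate:

```python
def integrate_ins(specific_force_samples, gravity_model, dt):
    """Naive strapdown integration along one axis (up-positive).

    The accelerometer senses specific force f = a - g.  The INS recovers the
    vehicle acceleration as a = f + g_model (Equation (1) rearranged); any error
    in g_model is integrated twice and appears as a growing position error.
    """
    velocity, position = 0.0, 0.0
    for f in specific_force_samples:
        a_est = f + gravity_model          # compensate with the assumed gravity value
        velocity += a_est * dt
        position += velocity * dt
    return position

# Hypothetical case: vehicle at rest, true gravity -9.8100 m/s^2, model -9.8095 m/s^2.
dt, duration = 0.1, 600.0                          # 10 Hz samples over 10 minutes
samples = [9.8100] * int(duration / dt)            # at rest the sensor reads +|g|
error = integrate_ins(samples, gravity_model=-9.8095, dt=dt)
print(f"position error after {duration:.0f} s: {error:.1f} m")  # ~0.5 * 0.0005 * 600^2 = 90 m
```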

The latest advances in MEMS inertial sensors are having profound effects on the recent availability of MEMS-Inertial Measurement Units (IMUs), which have become clearly attractive for a wide range of applications where size, weight, power, and cost are key considerations [28]. This set of sensors can be used to implement an Attitude and Heading Reference System (AHRS) or an INS. Some commercially available MEMS-based systems are shown in Table 1. Nevertheless, despite the technological developments in inertial sensors, efforts are still underway to reduce the INS accuracy drift to the level of a few meters of uncertainty over one hour of unaided inertial navigation [29]. Currently, damping techniques, using water speed measurements, are used to control velocity and position errors caused by uncorrected vertical deflection and inertial sensor errors [30]. However, this comes at the cost of introducing an additional error source (the water-speed/ground-speed difference caused by ocean currents). Another alternative to reduce these effects is the use of maps of vertical deflection compensation values, as a function of latitude and longitude, to compensate the measured accelerations.

Table 1. Commercial Inertial Measurement Unit (IMU)-Attitude and Heading Reference System (AHRS) systems.

Manufacturer | Product Name | Heading Accuracy/Resolution | Pitch and Roll Accuracy/Resolution | Data Rate (Hz) | Depth Rated (m)
Impact Subsea | ISM3D [31] | ±0.5°/0.1° | ±0.07°/0.01° | 250 | 1000–6000
Seascape Subsea | Seascape UW9XIMU-01 [32] | ±0.5°/0.01° | ±0.5°/0.01° | 400 | 750
Inertial Labs | AHRS-10P [33] | ±0.6°/0.01° | ±0.08°/0.01° | 200 | 600
SBG Systems | Ellipse2-N [34] | ±1.0°/- | ±0.1°/- | 200 | -
TMI-Orion | DSPRH [35] | ±0.5°/0.1° | ±0.5°/0.1° | 100 | 500–2000
VectorNav | VN-100 [36] | ±2.0°/0.05° | ±1.0°/0.05° | 400 | -
XSENS | MTi-600 [37] | ±1.0°/- | ±0.2°/- | 400 | -

2.2. Acoustic Navigation

Compared with other signals, such as radio and electromagnetic waves, acoustic signals propagate better in water and can reach considerable distances. This allows the use of acoustic transponders to navigate an AUV. Some of the navigation methods based on acoustic signals are Sound Navigation and Ranging (SONAR) and acoustic ranging.

2.2.1. SONAR
There exist different methods to employ a SONAR for AUV navigation. Two basic configurations are the side-scan SONAR [10] and the Forward-Looking SONAR (FLS) [38]. Both of them are used to detect objects, which can be seabed changes, rocks, other vehicles, and even marine species. When an AUV is in operation, it must be able to detect these objects to update its navigation trajectory and avoid collisions, which is known as obstacle avoidance.

For the side-scan SONAR, the transducer device scans laterally when attached to the AUV, as illustrated in Figure 4. A series of acoustic pings are transmitted and then received; the time of the returns and the speed of sound in water are used to determine the existence of features located perpendicular to the direction of motion.

Figure 4. AUV equipped with two side-scan Sound Navigation and Ranging (SONAR) devices.

The FLS uses a searchlight approach, steering the sonar beam scanning forward of the vessel and streaming soundings on a continuous basis. FLS can be placed at different locations on the vehicle, as shown in Figure 5, to ensure that the AUV can detect obstacles from different directions.

Figure 5. Forward-Looking SONAR (FLS) placed at vertical and horizontal orientations.

Two-dimensional images can be produced which survey the ocean and the features on it. These images, while indicating what exists on the ocean or seafloor, do not contain localization information, either relative or global.

Traditional obstacle avoidance planning methods include potential field, Bandler and Kohout (BK) products, particle swarm optimization, fuzzy controllers, etc. Galarza et al. [39] designed an obstacle avoidance algorithm for an AUV. The obstacle detection system disposes of a SONAR, and its use guarantees the safety of the AUV while navigating. Obstacle avoidance is performed based on a fuzzy reactive architecture for different forward speeds of the vehicle. The algorithm was validated in a computational simulation environment running in MATLAB. During the simulated route, the vehicle remained at a minimum distance of 5 m from the obstacles, reducing its reference forward speed of 1 m/s to values between 0.02 m/s and 0.4 m/s; thus, safe navigation around obstacles was achieved without losing the navigation trajectory and while reaching all the waypoints. Braginsky et al. [40] proposed an obstacle avoidance methodology based on the data collected from two FLS placed in horizontal and vertical orientations. FLS data is processed to provide obstacle detection information in the xz- and xy-planes, respectively. For the horizontal obstacle avoidance, the authors used a two-layer algorithm. The first process of the algorithm is based on BK products of fuzzy relations, as a preplanning method; the second is a reactive approach based on potential field and edge detection methods. In case the horizontal approach fails to find a path to safely avoid the obstacle, a reactive vertical approach is activated. The sonar used in the experimentation has a detection range of up to 137 m and operated at a 450 kHz frequency. During the test, the mission definition for the AUV was to move from a starting point to a target point. Despite the maximum range of the FLS, decisions were made when an obstacle was within 40 m of the AUV. Lin et al. [41] implemented a Recurrent Neural Network (RNN) with Convolution (CRNN) for underwater obstacle avoidance. Offline training and testing were adopted to modify the neural network parameters of the AUV autonomous obstacle avoidance learning system, so self-learning is applied to the collision avoidance planning. Combining this learning system with FLS simulation data enables online autonomous obstacle avoidance planning in an unknown environment. Simulation results showed that the planning success rate was 98% and 99% for the proposed CRNN algorithms; meanwhile, it was 88% and 96% for the RNN algorithms. The authors concluded that the CRNN obstacle avoidance planner has the advantages of shorter training time, a simpler network structure, better generalization performance, and better reliability than an RNN planner.
2.2.2. Acoustic Ranging

In acoustic ranging positioning systems, AUVs are equipped with an acoustic transmitter that establishes communication with a set of hydrophones. Knowing the propagation velocity of sound underwater, the distance between the AUV and the hydrophones can be calculated through the propagation time of the acoustic signal. Then, a location for the AUV with respect to the set of hydrophones can be obtained by geometric methods. One of the differences between acoustic systems is the arrangement of the hydrophones. In LBL systems, hydrophones are fixed within a structure or at any other known underwater point of reference, known as a landmark [15]. The length of the baseline can be up to hundreds of meters. In SBL and USBL systems, the hydrophones are placed on buoys or on another vehicle at the surface, or even on a second AUV. For SBL systems, the baseline length is measured in meters and the system works by measuring a relative position between the reference sound source and the receiving array; meanwhile, the baseline for USBL systems is in decimeters, and the relative location from the hydrophone to the moving target is calculated by measuring the phase differences between acoustic elements [18]. In either arrangement, the hydrophones are generally located by Global Navigation Satellite Systems. In Figure 6, all three configurations for acoustic localization systems are shown.
Figure 6. Acoustic localization systems: (a) Long Baseline, (b) Short Baseline, (c) Ultra-Short Baseline.
A schematic diagram of an SBL system is represented in Figure 7. Three hydrophones, represented by H1, H2, and H3, are located at the points O, N, and M, at the origin of the reference frame and along the x and y axes, respectively. The distance from the detected vehicle to the i-th hydrophone is called the oblique distance, which is denoted by Di, with i = 1, 2, 3.

Figure 7. Schematic diagram of a Short Baseline (SBL) system.
The vehicle receives a signal from a hydrophone (H1) and sends a reply that is received by all the hydrophones (H1, H2, and H3); then, the signal run time is measured. The propagation time of the acoustic signal from the transmitter in the vehicle to the hydrophone base (Ti) is used to obtain the oblique distance with the equation [18]:

Di = V · Ti, (2)

where V is the nominal speed of underwater acoustic signals, taken as V = 1435 m/s. The location of the vehicle's transmitter—point P—with coordinates Xp, Yp, and Zp, can be calculated using a traditional SBL model as follows:

P = (Xp, Yp, Zp), (3)

Xp = (D1² − D2² + N²) / (4N), (4)

Yp = (D1² − D3² + M²) / (4M), (5)

Zp = { √[D1² − (Xp² + Yp²)] + √[D2² − ((Xp − N)² + Yp²)] + √[D3² − (Xp² + (Yp − M)²)] } / 3. (6)

There exists an error between the measured position and the actual position of the transmitter. Among other factors, it is caused by not considering the variations of sound velocity produced by changes in the underwater environment conditions such as depth, temperature, density, and salinity. The accuracy of an acoustic positioning system will depend on different factors such as the distance and depth operational range, the number and availability of hydrophones, and the operational frequency. A few commercial baseline acoustic systems and their accuracy specifications are shown in Table 2.

Table 2. Commercial acoustic positioning systems.

Name | Type | Accuracy Range (m) | Operating Depth Range (m)
EvoLogics S2C R LBL [42] | LBL | Up to 0.15 | 200–6000
GeoTag seabed positioning system [43] | LBL | Up to 0.20 | 500
µPAP acoustic positioning [44] | USBL | Not specified | 4000
SUBSONUS [45] | USBL | 0.1–5 | 1000
UNDERWATER GPS [46] | SBL/USBL | 1% of distance range (1 m for a 100 m operating range) | 100
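As a worked illustration of the SBL model in Equations (3)–(6)—a minimal sketch only, with an invented hydrophone geometry and ideal, noise-free travel times—the following code converts three propagation times into oblique distances and then into a position estimate:

```python
import math

SOUND_SPEED = 1435.0  # m/s, nominal value used in Equation (2)

def sbl_position(t1, t2, t3, n, m):
    """Estimate the transmitter position from propagation times to three hydrophones.

    Hydrophones H1, H2, and H3 sit at the origin, on the x-axis, and on the y-axis;
    n and m are the baseline parameters of Equations (4) and (5).
    """
    d1, d2, d3 = (SOUND_SPEED * t for t in (t1, t2, t3))        # Equation (2)
    xp = (d1**2 - d2**2 + n**2) / (4.0 * n)                     # Equation (4)
    yp = (d1**2 - d3**2 + m**2) / (4.0 * m)                     # Equation (5)
    # Equation (6): average the three depth solutions.
    zp = (math.sqrt(max(d1**2 - (xp**2 + yp**2), 0.0))
          + math.sqrt(max(d2**2 - ((xp - n)**2 + yp**2), 0.0))
          + math.sqrt(max(d3**2 - (xp**2 + (yp - m)**2), 0.0))) / 3.0
    return xp, yp, zp

# Hypothetical example: 10 m baseline parameters and travel times of a few tens of milliseconds.
print(sbl_position(0.0352, 0.0349, 0.0355, n=10.0, m=10.0))
```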

Although acoustic systems have been in use for a long time, they are still the main localization and navigation systems for AUVs or for teams of AUVs and Unmanned Surface Vehicles (USVs). Batista et al. [47]
worked on a filter for combining LBL and USBL systems to estimate position, linear velocity, and
attitude of underwater vehicles. This filter considers an underwater vehicle moving in a scenario
where there is a set of fixed landmarks installed in an LBL configuration and the vehicle is equipped
with a USBL acoustic positioning system. The filter achieves good performance even in the presence of
sensor noise under a simulated environment. The resulting solution ensures a quick convergence of
the estimation error to zero for all initial conditions. However, it could not be a practical solution for
some cases since it requires a complex infrastructure.
A coordinated navigation of surface and underwater vehicles is proposed by Vasilijević et al. [48]. The proposed scheme is intended to serve as a first-responder monitoring team for environmental disasters at sea. The USV is connected to a ground station via Wi-Fi for control and monitoring; meanwhile, acoustic communication is used to send instructions to the AUV and to retrieve information from it. To locate the vehicles, a Global Positioning System (GPS) is mounted on the USV so it can get a position in geographic coordinates. Once the USV gets a location, a USBL system is used to get a relative location of the AUV with respect to the surface vehicle. An algorithm is run to convert this to a global position so the control station can know where both vehicles are. This allows the precise localization of pollution or any other problems found by the vehicles and is intended to help plan a rapid response. As long as the USV and AUV remain in close range for communication, the limitations of the USBL system are not a problem in this scenario. Sarda et al. [49] used a digital USBL system for AUV recovery. The AUV was equipped with a receiver array of four transducers, and a transponder array was mounted on a USV which served as the recovery station. The system proposed is not only
capable of estimating the distance between the AUV and the recovery location, but it is also able to
measure horizontal and vertical bearings. The system has an update period of 3 s and has an accuracy
of less than a meter. Its main limitation is the sensing range: the AUV must be within 25 m of the target location, or the system measurements are considered erroneous. Field experiments showed a
success rate of 37.5% at recovering the AUV.
Range-only—also known as Single-beacon—localization is another alternative to traditional
acoustic localization systems that has gained attention in recent years. The concept of
range-only/single-beacon positioning can be divided into two groups depending on the way they are
used [50]: (i) as a navigational aid for a moving vehicle, or (ii) localization of a stationary or moving
target. All these methods use a set of ranges between a target and different static nodes, known as
anchor nodes. Typically, these ranges can be obtained using the time of flight given the speed of sound
in water. Then, the unknown underwater target position problem can be solved using trilateration,
where, in general, three or more points are needed in 2D scenarios and at least four points in
3D scenarios.
A method for target positioning from a moving vehicle—which periodically measures the range
to the underwater target—is represented in Figure 8.
Figure 8. Range-only/Single-beacon positioning of a fixed target from a moving vehicle.

The underwater target position (Pt) is calculated using the moving vehicle positions (Pi) and the ranges measured between the moving vehicle and the target (r̄i), expressed as:

r̄i = ‖Pt − Pi‖ + wi, (7)

where wi is a zero-mean Gaussian measurement error. Different methods can be used to solve the system and find the target position through ranges: linearize the function and find a closed-form least squares solution, or use an iterative minimization algorithm to minimize a cost function related to the maximum likelihood estimate.
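The closed-form least-squares route mentioned above can be sketched as follows (an illustrative example only, not the estimator used in the cited works; the positions and ranges are invented): subtracting the squared-range equation at a reference point from the others removes the quadratic terms and leaves a small linear system in the target coordinates.

```python
import numpy as np

def range_only_fix(anchors, ranges):
    """Closed-form least-squares target position from ranges to known points.

    Subtracting the squared-range equation at anchor 0 from the others removes
    the quadratic terms and leaves a linear system in the target coordinates.
    Works in 2D with >= 3 non-collinear points (or in 3D with >= 4 non-coplanar).
    """
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    p0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(p0**2))
    target, *_ = np.linalg.lstsq(A, b, rcond=None)
    return target

# Hypothetical 2D example (depth assumed known from a pressure sensor):
# four measurement positions along the vehicle track, true target at (12, -5).
P = [(0.0, 0.0), (50.0, 0.0), (50.0, 50.0), (0.0, 50.0)]
true_target = np.array([12.0, -5.0])
r = [np.linalg.norm(true_target - np.array(p)) for p in P]
print(range_only_fix(P, r))   # recovers approximately [12. -5.]
```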
Bayat et al. [51] presented an AUV localization system that relied on the computation of the ranges between the vehicle and one or more underwater beacons, the location of which may be unknown. The aim of the system was to compute in real time an estimate of the position of the AUV and simultaneously construct a map composed of the estimates of the locations of the beacons. Experiments were performed with three autonomous marine vehicles following three different trajectories. Minimum-energy estimation, projection filters, and multiple-model estimation techniques were used as observers to compare the results. A combination of those estimators produced the best results in terms of error in the trajectory followed by the AUV, which was reduced from tens of meters to a few meters in the first three minutes of the test. Villacrosa et al. [52] presented a solution to range-only localization using a Sum of Gaussian (SoG) filter. Two variations of the SoG filter were proposed and tested in real experiments, where an AUV performed an autonomous localization and homing maneuver. The results in all experiments showed that the AUV was able to home with an error smaller than 4 m. Results were corroborated by a vision-based algorithm. Masmitja et al. [50] developed a range-only underwater target localization system. A wave glider performed as a moving LBL in simulations and real sea tests. The aim of the study was to determine the best path and its characteristics, such as number of points, radius, and offset, to obtain the desired target localization performance. Results showed that with a minimum number of 12 points, a radius greater than 400 m, and an offset as low as possible, the Root-Mean-Square Error (RMSE) can be less than 4 m.

Zhang et al. [53] presented a new method to solve problems of LBL systems such as communication synchronization among hydrophones. The system considers a Strapdown Inertial Navigation System (SINS) and the formation of a matrix of several virtual hydrophones. A single sound source is placed at the bottom of the sea and sends periodic signals, while a single hydrophone is installed on the AUV. Along the AUV navigation trajectory, four selected recent positions of the AUV are regarded as four virtual hydrophones of the LBL matrix, which constitute a virtual LBL matrix window. Simulation results indicate that the proposed method can effectively compensate for the position error of the SINS. Thus, the positioning accuracy can be confined to 2 m.
2.3. Geophysical Navigation
To avoid the problem of INS drift and the cost of infrastructure for underwater acoustic systems, geophysical navigation (GN) is a favorable alternative. These approaches match the sensor measurements with geophysical parameters such as bathymetry, magnetic field, and gravitational anomaly contained in a map. Navigation technology based on GN can correct the INS error over a long time [54], without the need to bring the AUV to the surface. The navigation algorithm estimates navigation errors, which are sent to the vehicle navigation system to correct its position. By providing continuous corrections, this method allows the vehicle to maintain the required position accuracy without the need for external sensors, such as GPS. The main limitations of GN are the need for a map available prior to the mission, and the computational complexity of searching for a correlation between the map and the sensor estimates. On the other hand, the key advantage of GN over other technologies is its large operating range. Given a map, GN provides bounded localization error with accuracies dependent on the DR navigation, the map resolution, and the sensitivity of the geophysical parameter to changes in the vehicle state [55].
GN matching algorithms are classified in two different broad categories: batch methods and
sequential methods [26]. The main algorithms for those methods have been TERCOM and Iterated
Closest Contour Point (ICCP) [56] for batch methods; and SITAN, Beijing University of Aeronautics and
astronautics Inertial Terrain-Aided Navigation (BITAN) and BITAN-II [25,30,57] for sequential methods.
TERCOM correlates active range sensor observations with a digitized elevation database of terrain.
Meanwhile, the essence of SITAN is the acquisition mode and tracking mode, which are basically a
state-estimation problem based on an Extended Kalman Filter (EKF) after the non-linear system state
equation and observed equation are linearized. Particle Filter (PF) and Bayesian estimators are also
algorithms used in sequential methods.

2.3.1. Gravity Navigation


As mentioned in Section 2.1, the earth’s gravitational field is far from being uniform and, for an
INS, the effects of a change in the local gravitational field are indistinguishable from accelerations of the
vehicle. One alternative is complementing the INS with gravity navigation. At the same time that an
INS estimates the position of the vehicle, a gravity sensor—gravimeter or gradiometer—measures the
gravity and gravity gradient where the AUV is located. A gravimeter measures gravity anomaly or the
deviation in the magnitude of the gravity vector relative to a nominal earth model. A gradiometer is a
pair of accelerometers with parallel input axes on a fixed baseline that measures gravity gradients or the
rate of change of gravity with respect to linear displacement [29]. The difference between the accelerometers' outputs excludes the linear vehicle acceleration but contains the gradient of gravitation across the
baseline. Based on the position and the measurements from the sensor, the database searches for the
best fit of gravity and gravity gradient, and then the optimal matching position will be used to correct
the position error of the INS. Han et al. [58] proposed a matching algorithm for gravity aided navigation,
combining an ICCP algorithm with a Point Mass Filter (PMF) algorithm. The algorithm involved a
two-step matching process. First, the PMF, based on the vehicle position variable, can obtain an instructional position in real time, even given a large initial position error. Then, the ICCP algorithm can be employed for further matching. To verify the validity of the proposed matching algorithm, a numerical
simulation was performed with a 12 h sailing period, where the speed of the underwater vehicle was
set to 10 nmi/h. Simulation tests indicated that compared with the conventional ICCP algorithm, the
proposed algorithm can achieve better results in terms of latitude and longitude positioning errors,
which were reduced up to 56% and up to 65% when compared with the INS standalone.

2.3.2. Geomagnetic Navigation


Geomagnetic Navigation relies on magnetic sensors and its essence is the Fitting of Two Point
Sets (FTPS) process, where a marine geomagnetic map is used for matching [26]. The geomagnetic field
has many features which can be applied for matching [59], such as the intensity of the total field F, the
horizontal component H, the north component X, the east component Y, the vertical component Z,
the declination D, the inclination I, the geomagnetic gradient, and so on. These features are shown in
Figure 9.
Figure 9. Geomagnetic map features.

Zhao et al. [60] studied two matching algorithms, TERCOM and ICCP, used in geomagnetic matching navigation. An experiment was designed to test the accuracy of the underwater navigation system, using a Differential GPS (DGPS) receiver to provide the exact position of the vehicle. In the results, the matching positioning errors in the x and y directions were less than 100 m. The authors conclude that both TERCOM and ICCP can achieve credible geomagnetic navigation, with the difference that ICCP can provide a real-time positioning solution and TERCOM cannot. Ren et al. [7] presented a new algorithm to solve FTPS in geomagnetic localization. The algorithm was an improved version of the ICCP algorithm, based on the algorithm proposed by Menq et al. [61]. Simulation results showed that the ICCP-Menq algorithm had better performance than the original ICCP algorithm in terms of dealing with geomagnetic-matching localization. Wang et al. [62] presented a new method which was based on the integration of TERCOM, a K-means clustering algorithm, and an INS. An experiment was implemented for evaluating the accuracy and the stability of the proposed method. The INS and DGPS were set on the surveying vessel. In order to verify the accuracy of this new method, the positioning result from the DGPS was used for comparison with the result of the matching navigation. After completing the experiment, the error of the new method was under 50 m, while the traditional method showed an error up to 7 times higher.
2.3.3. Bathymetric Navigation
One simple use of bathymetric maps for AUV navigation is the use of isobaths. An isobath is an imaginary curve that connects all points having the same depth below the surface. A controller [63] can be designed for an AUV to follow an isobath with only low-level localization equipment—such as an echo sounder—and ensure that it never leaves a pre-defined area. Terrain-Referenced Navigation (TRN), Terrain-Aided Navigation (TAN), and Terrain-Based Navigation (TBN) are all similar approaches for GN [64]. These systems estimate the errors in both a main navigation system—such as an INS—and the terrain database to provide a highly accurate position estimate relative to the digital terrain database. TBN operates by correlating the actual terrain profile overflown with the terrain information stored in the terrain database. A basic measurement equation [55] for TBN is given by:

y = h(x) + e, (8)

where h(·) is the terrain elevation function, x is the vehicle location, y is the measured terrain height, and e is the measurement noise. An example of terrain correlation in one dimension for a single altimeter measurement is represented in Figure 10.
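A minimal sketch of the one-dimensional correlation idea behind Equation (8) is shown below (illustrative only, with an invented terrain profile and noise level, and not one of the estimators reviewed in this section): each candidate position is scored by how well the stored terrain heights around it match the measured profile.

```python
import numpy as np

def terrain_match_1d(terrain_map, measured_profile, noise_std=1.0):
    """Score every candidate position x by the likelihood of y = h(x) + e (Equation (8)).

    terrain_map      : stored terrain heights h on a regular 1-D grid
    measured_profile : sequence of altimeter-derived heights along the track
    Returns the grid index whose local map segment best matches the measurements.
    """
    n, m = len(terrain_map), len(measured_profile)
    scores = np.full(n - m + 1, -np.inf)
    for x in range(n - m + 1):
        residual = terrain_map[x:x + m] - measured_profile
        scores[x] = -0.5 * np.sum((residual / noise_std) ** 2)   # log-likelihood up to a constant
    return int(np.argmax(scores))

# Hypothetical map and a noisy profile actually taken starting at index 40.
rng = np.random.default_rng(0)
h = np.cumsum(rng.normal(0.0, 2.0, 200))                 # synthetic rough seafloor
profile = h[40:55] + rng.normal(0.0, 1.0, 15)            # measured heights with noise
print("estimated start index:", terrain_match_1d(h, profile))   # close to 40
```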
Figure 10. Terrain-Referenced Navigation.

Zhao et al. [65] worked on a TAN algorithm that combined TERCOM and PF. Experiments were performed to compare the proposed algorithm with the BITAN II algorithm. Results showed that the North and East position error remained below 100 m for the new algorithm, and the mean error was less than half of the mean error for the BITAN-II algorithm. Based on those results, the authors concluded that their system was more reliable, possessed a higher positioning precision, and showed a better stability than the one used for comparison.
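To make the measurement model of Equation (8) concrete, the following minimal sketch illustrates a one-dimensional terrain-matching step in the spirit of the PF-based TAN approaches reviewed here. The terrain profile, noise levels, motion model, and all numerical values are invented for illustration and do not correspond to any of the cited systems.

```python
import numpy as np

def terrain(x):
    # Hypothetical 1-D terrain elevation function h(x) from Equation (8); a real
    # system would look this up in a stored bathymetric map.
    return 50.0 + 10.0 * np.sin(0.01 * x) + 5.0 * np.cos(0.003 * x)

rng = np.random.default_rng(0)
n = 500
particles = rng.uniform(0.0, 5000.0, n)      # candidate along-track positions x
weights = np.full(n, 1.0 / n)

true_x, sigma_e = 1800.0, 1.0                # true position and altimeter noise std. dev.

for _ in range(20):                          # twenty altimeter pings while moving
    true_x += 25.0                           # vehicle advances 25 m between pings
    particles += 25.0 + rng.normal(0.0, 2.0, n)        # propagate with DR uncertainty
    y = terrain(true_x) + rng.normal(0.0, sigma_e)     # measurement y = h(x) + e
    weights *= np.exp(-0.5 * ((y - terrain(particles)) / sigma_e) ** 2)
    weights /= weights.sum()
    # Systematic resampling keeps the particle set focused on likely positions.
    idx = np.searchsorted(np.cumsum(weights), (rng.random() + np.arange(n)) / n)
    particles = particles[np.minimum(idx, n - 1)]
    weights = np.full(n, 1.0 / n)

print(f"true x = {true_x:.1f} m, estimated x = {particles.mean():.1f} m")
```

Because several stretches of terrain can look alike, real TAN implementations constrain the filter with DR or INS information, as the works discussed here do.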
Salavasidis et al. [66] proposed a low-complexity PF-based TAN algorithm for a long-range, long-endurance deep-rated AUV. The potential of the algorithm was investigated by testing its performance using field data from three deep (up to 3700 m) and long-range (up to 195 km in 77 h) missions performed in the Southern Ocean. Authors compared TAN results to position estimates through DR and USBL measurements. Results showed that TAN holds a potential to extend underwater missions to hundreds of kilometers without the need for surfacing to re-initialize the estimation process. For some of the missions analyzed, the RMSE of the TAN algorithm was up to 7 times lower when compared with the DR measurements, and the absolute water-depth difference was reduced up to 66% when compared with USBL measurements. Meduna et al. [67] proposed a TRN system for vehicles with low-grade navigation sensors, with the aim of improving navigation capabilities of simple DR systems. The algorithm uses an 8-dimensional particle filter for estimating critical motion sensor errors observed in the vehicle. Field trials were performed on an AUV with DR navigational accuracy of 5–25% of Distance Traveled (DT). The ability of TRN to provide 5–10 m navigational precision and an online return-to-site capability was demonstrated.
2.4. Optical Navigation

Optical technologies are a relevant option to provide information about the environment. These systems can be implemented either with a camera or with an array of optical sensors. Despite the poor transmission of light through water, which results in a limited range for imaging systems [68], different algorithms and techniques are being studied for this purpose. In Figure 11, two examples of optical systems are shown, where the AUV must detect and follow active landmarks within a structure (a) or identify a pattern made with active marks to navigate through it (b).
Figure 11. Optical localization systems based on active landmarks. (a) AUV following an array of active markers, (b) AUV locating an entrance by an arrangement of active markers.
An optical detector array sensor system was presented for AUV navigation by Eren et al. [69]. The performance of the developed optical detector array was evaluated for its capability to estimate the position, orientation, and forward velocity of AUVs regarding a light source fixed underwater. The results of computational simulations showed that a hemispherical frame design with a 5 × 5 photo-detector array was sufficient to generate the desired position and orientation feedback to the AUV, with a detection accuracy of 0.2 m in translation (surge, sway, and heave) and 10° in orientation (pitch and yaw), based on a spectral angle mapper algorithm. Some of these optical or artificial vision systems have been applied to AUVs for different purposes such as docking and recovery. Zhong et al. [70] developed an artificial vision system capable of detecting a set of lamps located around the desired docking location for an AUV. The AUV uses a binocular localization method to locate the docking platform and navigates to reach it. Navigation lamps were mounted at the entrance of the docking station as active beacons. Three common underwater green lamps were symmetrically positioned on the docking model around the center of the three lamps. An experiment using a ship model was conducted in a laboratory to evaluate the feasibility of the algorithm. The test results demonstrated that the average localization error is approximately 5 cm and the average relative location error is approximately 2% in the range of 3.6 m. A similar approach was proposed by Liu et al. [71]. A vision-based framework for automatically recovering an AUV by another AUV in shallow water was presented in this work. The proposed framework contains a detection phase for the robust detection of underwater landmarks mounted on the docking station in shallow water, and a pose-estimation phase for estimating the pose between AUVs and underwater landmarks. In ground experiments, they observed that the mean orientation and position errors were 1.823° and 6.306 mm, respectively, in the absence of noise, and 2.770° and 9.818 mm, respectively, in the presence of strong noise. Field experiments were performed to recover a sub-AUV by a mother vessel in a lake using the proposed framework, and experiments showed that the algorithm outperformed the state-of-the-art method in terms of localization error.
Although these systems showed a response with high accuracy, pre-installed infrastructure is needed to implement them. An alternative approach is the use of a camera or set of cameras to identify features in the environment or targets for the AUV mission. Monroy et al. [72] developed a micro AUV with an artificial vision system that allows it to follow an object by its color. A Hue Saturation Value (HSV) filter was implemented on the artificial vision system and a non-linear proportional-derivative controller on the vehicle to stabilize the heave and surge movements. A search and recovery problem is addressed by an intervention AUV by Prats et al. [73]. The problem consisted of finding and recovering a flight data recorder. The mission comprises two stages: survey and intervention. As the system was tested on a water tank, the survey stage consisted of a pre-defined trajectory of the AUV. This trajectory guarantees that images taken by the AUV cameras cover the complete bottom of the tank. Once the survey is complete, the flight data recorder is identified on the images by applying an HSV histogram and then located, so the intervention stage can take place. Even though these techniques are quite popular on land and air robots, working this way has several restrictions underwater. It is required to know before the mission what the robot is looking for; the robot must be pointed to an object of potential interest and HSV boundaries must be manually selected until it is well detected; it also has the inconvenience that colors are not the same underwater as above water, because they are strongly affected by illumination.
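As an illustration of the HSV-based detection used in works such as [72,73], the sketch below thresholds a frame in HSV space and returns the centroid of the largest matching blob. It assumes OpenCV is available, and the HSV bounds are hypothetical placeholders that would have to be tuned by hand, exactly the limitation pointed out above.

```python
import cv2
import numpy as np

def detect_target(frame_bgr, hsv_low=(5, 120, 80), hsv_high=(25, 255, 255)):
    """Return the pixel centroid (u, v) of the largest blob inside the HSV bounds, or None.

    The bounds are illustrative values for an orange-ish object; underwater they would
    need manual re-tuning because color shifts strongly with depth and illumination.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low, np.uint8), np.array(hsv_high, np.uint8))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```

The centroid error would then feed the vehicle controller, for example the proportional-derivative law mentioned for the micro AUV of [72].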

2.5. Simultaneous Location and Mapping (SLAM)

Simultaneous Location and Mapping (SLAM) is a technique in which a mobile robot, such as an AUV, is placed at an unknown location in an unknown environment and made able to build a consistent map of the environment and determine its location within this map [74]. In Figure 12, a SLAM solution is represented where an AUV is equipped with a sensor to explore the environment and create a digital reconstruction of it. Color codes can be used to represent information such as the distance between the vehicle and obstacles.

Figure 12. Simultaneous Location and Mapping (SLAM) of (a) an AUV equipped with a sensor to map its environment and (b) digital reconstruction of the environment.

There are different SLAM representation methods used to reconstruct the environment. Each one has its own shortcomings and advantages; choosing the best one depends on the desired application, which can be inspection, navigation, interaction, etc. The principal representation methods are listed in Table 3.

Table 3. SLAM representation methods.

Method | Type | Description | Applications
Landmark-based maps | 2D/3D | Models the environment as a set of landmarks extracted from features such as points, lines, corners, etc. | Localization and mapping [75].
Occupancy grid maps | 2D | Discretizes the environment in cells and assigns a probability of occupancy to each cell. | Exploring and mapping [76].
Raw Dense Representations | 3D | Describes the 3-D geometry by a large unstructured set of points or polygons. | Obstacle avoidance and visualization [77].
Boundary and Spatial-Partitioning Dense Representations | 3D | Generates representations of boundaries, surfaces, and volumes. | Obstacle avoidance and manipulation [78].
Underwater SLAM can be categorized in acoustic-based and vision-based [38]. The perception of optical devices is constrained by poor visibility and noise produced by sunlight in shallow waters. Nevertheless, they can provide high frequencies and high resolution for a lower cost than an acoustic system. On the other hand, a high-definition FLS can provide a promising alternative for working under challenging conditions.
In [79], Hernández et al. presented a framework to give an AUV the capability to explore unknown
environments and create a 3D map simultaneously with an acoustic system. The framework comprises
two main functional pipelines. The first provides the AUV with the capacity of creating an acoustic
map online, while planning collision-free paths. The second pipeline builds a photo-realistic 3D
model using the gathered image data. This framework was tested in several sea missions where
results validated its capabilities. Palomer et al. [80] used a multi-beam echo-sounder to produce high
consistency underwater maps. Since there is not a general method to evaluate consistency of a map,
authors computed the consistency-based error [81] and proposed a 3D statistic method named #Cells.
The statistic method consisted of counting the number of cells that each bathymetric map occupies within the same 3D grid. If a map occupies fewer cells, it is probably because its point clouds are more
densely packed due to a better registration. The algorithm was tested using two real world datasets.
Three surfaces were created for different navigation methods: DR, USBL and the proposed algorithm.
Regarding the number of occupied cells, the proposed method occupied 5.76% less cells than a DR
model, and 7.24% less than the USBL model.
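The #Cells statistic described above reduces to a few lines of array code: discretize each point cloud into a common 3D grid and count the distinct occupied cells. The grid resolution below is an arbitrary placeholder, not the value used in [80].

```python
import numpy as np

def count_occupied_cells(points, cell_size=1.0, origin=(0.0, 0.0, 0.0)):
    """Count the distinct 3D grid cells occupied by an (N, 3) point cloud.

    A better-registered map packs its points into fewer cells, which is the
    intuition behind using #Cells as a consistency measure.
    """
    idx = np.floor((np.asarray(points) - np.asarray(origin)) / cell_size).astype(np.int64)
    return len(np.unique(idx, axis=0))

# Hypothetical comparison of two reconstructions of the same survey area:
# n_dr  = count_occupied_cells(cloud_from_dead_reckoning)
# n_new = count_occupied_cells(cloud_from_proposed_method)
# Fewer occupied cells for the proposed method suggests a more consistent map.
```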
Gomez-Ojeda et al. [82] implemented a visual-based SLAM algorithm. Authors compared
a stereo Point and Line SLAM (PL-SLAM) with an Orientated FAST and Rotated BRIEF (ORB)
SLAM, a point-only system and a line-only system. Results showed superior performance of the
PL-SLAM approach relatively to ORB-SLAM, in terms of both accuracy and robustness in most of the
dataset sequences. The mean translational error was lower for PL-SLAM in 55% of the sequences and the mean rotational error in 73% of the cases. Nevertheless, that work was not tested for
underwater applications. After that, Wang et al. [83] proposed a method to improve the accuracy of
vision-based localization systems in feature-poor underwater environments using PL-SLAM algorithm
for localization. Three experiments were performed, including walking along the wall of a pool,
walking along a linear route, and walking along an irregular route. The experimental results showed
that the algorithm was highly robust in underwater low-texture environments due to the inclusion of
line segments. At the same time, the algorithm effectively achieved a high localization accuracy. The attitude error—computed as shown in Equation (9)—was 0.1489 m, which represented 2.98% of DT.
Attitude error = √((error_x)² + (error_y)² + (error_z)²) (9)

Authors conclude that it can be implemented in the navigation and path planning of AUVs in
the future. With the aim to explore the capabilities of visual-based SLAM in real and challenging
environments, Ferrera et al. [84] proposed what they considered as the first underwater dataset
dedicated to the study of underwater localization methods from low-cost sensors. The dataset has
been recorded in a harbor and provides several sequences with synchronized measurements from
a monocular camera, a Micro-Electro-Mechanical System-Inertial Measurement Unit (MEMS-IMU)
and a Pressure Sensor (PS). Among the sensors used in the dataset acquisition were a 20 frames per
second (fps), 600 × 512 px monochromatic camera, and a 200 Hz IMU. As a benchmark, authors ran
experiments using state-of-the-art monocular SLAM algorithms, and then compared ORB-SLAM,
Semi-direct Visual Odometry (SVO) and Direct Sparse Odometry (DSO). Results showed an absolute
translation error between 24–52 cm, 24–67 cm, and 2–56 cm for each of the methods applied, which
highlighted the potential of vision-based localization methods for underwater environments. With the
same idea, Joshi et al. [85] formed their own datasets from an underwater sensor suite—equipped with
a 100 Hz IMU and a 15 fps, 1600 × 1200 px stereo camera—operated by a diver, an underwater sensor
suite mounted on a diver propulsion vehicle, and an AUV. Experiments were conducted for each
dataset considering the following combinations: monocular; monocular with IMU; stereo; and stereo
with IMU, based on the modes supported by each Visual Odometry (VO) or Visual Inertial Odometry
(VIO) algorithm. Results showed that DSO and SVO, despite quite often failing to track the complete trajectory, had the best reconstructions for the tracked parts and, as expected, stereo performed better than monocular. The results confirmed that incorporating IMU measurements leads to drastically higher performance in comparison to the pure VO packages.

2.6. Sensor Fusion
As established in Section 2.1, the main drawback of an INS is that the position and orientation accuracy drifts over time, so, to keep it under the limits expected for safe AUV navigation, the system must correct its error by periodically comparing its position estimation with a fixed location measured from additional sensors—such as a GPS. To overcome this, the INS can be fused with other sensors. There are two main schemes for sensor fusion: loosely coupled (LC) and tightly coupled (TC). The basic difference is the data shared by the sensors. In an LC scheme, a solution for the position or orientation of the AUV is obtained for each sensor individually and then blended using a filter—such as a Kalman Filter (KF)—to obtain a more accurate and reliable solution. In a TC scheme, raw measurements of the sensors are processed directly on the filter to overcome problems such as poor signal quality or limited coverage, thanks to the filter's capabilities to predict the pose of the vehicle. In this case, a more robust filter is needed, so variants of the KF are commonly used [86], such as an EKF or Unscented Kalman Filter (UKF). Filter selection is essential to get a better solution for the vehicle's pose and, besides the sensor fusion approach adopted, accuracy, numerical efficiency, and computational complexity must be considered. LC and TC schemes are represented in Figure 13 with velocity estimation from an INS and a Doppler Velocity Logger (DVL) as an example.

Figure 13. Loosely Coupled (LC) vs. Tightly Coupled (TC) sensor fusion schemes.
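A minimal sketch of the LC idea in Figure 13 is given below: an INS-propagated velocity is corrected whenever an independent DVL velocity solution arrives, using a scalar Kalman filter per axis. The noise values and update rates are illustrative assumptions, not parameters taken from the cited works.

```python
import numpy as np

class LooselyCoupledVelocity:
    """Scalar Kalman filter blending INS-propagated and DVL velocities (LC scheme, one axis)."""

    def __init__(self, q_ins=1e-3, r_dvl=4e-4):
        self.v = 0.0       # fused velocity estimate [m/s]
        self.p = 1.0       # estimate variance
        self.q = q_ins     # variance added by each INS propagation step
        self.r = r_dvl     # variance of a DVL velocity solution

    def propagate(self, accel, dt):
        # Prediction with the INS: integrate acceleration and inflate uncertainty.
        self.v += accel * dt
        self.p += self.q
        return self.v

    def update_dvl(self, v_dvl):
        # Correction with a DVL solution (only when a valid bottom-lock fix exists).
        k = self.p / (self.p + self.r)
        self.v += k * (v_dvl - self.v)
        self.p *= 1.0 - k
        return self.v

# Example: 200 Hz INS propagation corrected by a 1 Hz DVL, rates similar to those in [90].
kf = LooselyCoupledVelocity()
rng = np.random.default_rng(1)
for step in range(1000):
    kf.propagate(accel=rng.normal(0.0, 0.05), dt=0.005)
    if step % 200 == 199:                     # one DVL fix per second
        kf.update_dvl(v_dvl=rng.normal(0.0, 0.02))
print(f"fused velocity estimate: {kf.v:.3f} m/s")
```

In a TC scheme, by contrast, the individual DVL beam velocities and depth readings would enter the filter as raw measurements instead of a pre-computed velocity solution.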

Most of the sensor fusion systems for AUV navigation are those of an INS aided by a DVL; typically, the fusion is under an LC scheme [87–89] with a linear filter. However, in cases where the DVL measurements are limited, an LC algorithm leaves the INS to work alone. This produces an accumulative error which gets bigger with time. Considering this, Liu et al. [90] explored a TC scheme as an alternative. This approach includes depth updates given by a depth sensor among the raw measurements from the DVL to help the INS and avoid the drift caused by limited measurements on the LC approach. Different trajectories were simulated for an AUV, including a straight line for 1800 s at a constant velocity. For the simulations, the update frequencies of the INS, DVL, and PS were 200 Hz, 1 Hz, and 1 Hz, respectively. For the x, y, and z axes, a gyro drift of 0.01°/h and a 100 µg accelerometer bias were introduced as INS errors; 0.002 m/s as a constant DVL error, and 0.05 m as a constant PS error. The results showed a cumulative error of 1000 m at the end of the trajectory for the LC approach and only 10 m in the TC case. The same disadvantages of the LC fusion of an INS/DVL system were addressed by Tal et al. [91]. In their work, they designed a navigation system based on a 150 Hz
INS aided with a 1 Hz DVL, a 0.5 Hz magnetometer, and a 0.25 Hz PS under an Extended Loosely
Coupled (ELC) approach within an EKF. They focused their work on exploring cases where only partial
measurements of the DVL were available and used external information to complete the velocity
calculation of the vehicle. To test their system, different trajectories of a vehicle were simulated. Results
showed a better performance by the ELC scheme, with improvements up to 38% on Root-Mean-Square
Errors when compared with the standalone INS and up to 12% compared with a TC scheme.
Another approach is the fusion of INS with acoustic systems. In [92,93], Zhang et al. investigated
the use of an AUV positioning method based on a SINS and an LBL under a TC algorithm. Authors were
looking to solve position error accumulation of AUVs. They compared the TC and LC approaches by
simulating an AUV trajectory under different conditions, such as changing the number of hydrophones
available. Test results demonstrated that the system proposed in this work is more reliable than LC
approach since the error on the trajectory—particularly when approaching or leaving the hydrophones
array—was up to 50% lower.
Artificial vision is also being fused with INS to improve its performance. Manzanilla et al. [94]
addressed autonomous navigation for AUVs. They used artificial vision fused with an IMU on an LC algorithm. Parallel Tracking and Mapping (PTAM) was implemented to localize the vehicle with respect to
a visual map, using a single camera—15 fps, 640 × 480 px. Then, an EKF was used to fuse the visual
information with data from an IMU, to recover the scale of the map and improve the pose estimation.
In this work, fully autonomous trajectory tracking was successfully achieved and compared using
standalone PTAM and the sensor fusion approach. Results showed that the trajectory followed by the
vehicle using sensor fusion had errors no bigger than 20 cm, whilst the standalone PTAM drifted up to 60 cm.
The EKF is the most widely used nonlinear filtering approach in TC schemes. EKF is based on
a simple linear approximation to the nonlinear equations. However, there are too many unknown
disturbance factors underwater, and they cannot be established in suitable mathematical models
in the kinematic equation. Other alternatives to a traditional EKF have been explored. Li et al. [95]
proposed a multi-model EKF integrated navigation algorithm. It was designed to solve the harsh
underwater environment problems. This algorithm, based on the probabilistic data association theory,
was compared with standard EKF in a lake trial using an AUV equipped with an IMU, an AHRS, and
an LBL system with four acoustic beacons. Results showed a better performance by the multi-model
EKF since the error between true positions and estimations was less than 12 m. The algorithm was shown to be able to overcome disturbances that produced peaks of over 400 m on traditional EKF estimations.
Chen et al. [96] worked on another alternative to an EKF for TC SINS/LBL navigation systems. Instead
of applying an EKF they used a near-real-time (NRT) Bayesian framework. They compared NRT
framework with EKF approaches with an accurate and a poor initialization. Results showed a better
performance by the NRT solution, with an 80% reduction of the measurement residuals under a poorly accurate yaw error initialization.
The main alternatives for sensor fusion based on an INS are summarized in Figure 14.
Figure 14. Sensor fusion alternatives for AUV positioning.
2.7. Localization and Navigation Overview

General conclusions in terms of sensor performance for non-traditional AUV navigation and localization technologies are shown in Table 4.
After the literature review, it can be considered that acoustic-based technologies are still a reliable alternative for AUV localization and navigation, although they require more infrastructure than others. Future work must consider the possibility to include them in teams of collaborative AUVs. To achieve that, acoustic systems must overcome low update rates and limited accuracy (at long ranges) in order to avoid collisions in AUV formations, especially when they are navigating within a few meters of each other. On the other hand, visual-based localization technologies—including SLAM—have gained attention in recent years. These technologies can estimate both position and orientation, contrary to acoustic methods. They also reach a higher accuracy, which is critical for the collaborative navigation of AUVs. Thus, they are an interesting and reliable option for some particular tasks under specific environments. Nevertheless, most of them are at an early level of readiness since they have been tested only in very controlled environments. It seems difficult for visual-based systems to overcome the challenging conditions of underwater. Moreover, it is hard for researchers to find the proper conditions to test their visual-based and visual-SLAM algorithms in real underwater conditions. To deal with that, some datasets are being collected, such as the AQUALOC dataset [97], which is dedicated to the development of SLAM methods for underwater vehicles navigating close to the seabed. The Autonomous Field Robotics Laboratory (AFRB) [98] has some datasets available for the same purpose.
Table 4. Technologies for AUV localization and navigation.

Navigation Technology | Approaches | Information Available | Accuracy | Range | Results
Acoustic | SONAR | Distance from obstacles. | Depending on distance from obstacles, from 5–10 cm to more than a meter (10–120 cm). | From 5 m up to hundreds of meters from obstacles. | Experimental in real conditions.
Acoustic | Acoustic range (LBL, SBL, USBL). | Position. | Depending on distance from the hydrophone array and the frequency, from some centimeters up to tens of meters. | Up to tens of meters from the array. | Experimental in real conditions.
Geophysical | Gravity, geomagnetic, TAN, TRN, TBN. | Position. | Meters, depending on the map resolution and filter applied. | Kilometers from initial position. | Simulation, experimental under controlled conditions.
Optical | Light sensors. | Position and orientation relative to a target. | Up to 20 cm for position and 10° for orientation. | 1–20 m from markers. | Simulation, experimental under controlled conditions.
Optical | Cameras. | Position and orientation relative to a target. | Up to 1 cm for position and 3° for orientation. | 1–20 m from markers. | Experimental in real conditions.
SLAM | Acoustic. | Position and orientation relative to the mapped environment. | From some centimeters up to more than a meter. | Up to tens of meters from targets. | Experimental in real conditions.
SLAM | Cameras. | Position and orientation relative to the mapped environment. | From some centimeters up to more than a meter. | 1–10 m from targets. | Simulations, experimental under controlled conditions.
Sensor fusion | ELC, LC, TC. | Position, orientation, and velocity. | Depending on the approach and filter applied, accumulative error can be reduced up to some meters (5–20). | Kilometers from initial position. | Simulations, experimental.
3. Collaborative AUVs
Once the navigation and localization problem for the AUVs is solved, a scheme for collaborative
work between a group of robots can be proposed. Collaborative work refers to an interaction of
two or more AUVs to perform a common task which can be collaborative navigation, exploration,
target search, and object manipulation. Using a team of AUVs navigating on a certain formation has
the potential to significantly expand the applications for underwater missions, such as those that require proximity to the seafloor or cover a wide area for search, recovery, or reconstruction. At
first, researchers focused their work on how multiple vehicles could obtain data simultaneously from
the same area of interest. Nowadays, their focus has moved to the trajectory design and operation
strategies for those multi-vehicle systems [99].
3.1. Communication
The rapid attenuation of higher frequency signals and the unstructured nature of the undersea
environment make it difficult to establish a radio communication system for AUVs. For those reasons,
wireless transmission of signals underwater—especially for distances longer than 100 m—relies almost
only on acoustic waves [14,100]. Underwater acoustic communication using acoustic modems consists
of transforming a digital message into sound that can be transmitted under water. The performance of
these systems changes dramatically depending on the application and the range of operation [101].
The main factors to choose an underwater acoustic modem are:

• Application: Consider the type and length of message (command and control messages, voice messages, image streaming, etc.), the frequency of operation, and the operating depth.
• Cost: Depending on the complexity and performance, from some hundreds of dollars up to $50,000 (USD).
• Size: Usually cylindrical, with lengths from 10 cm to 50 cm.
• Bandwidth: Acoustic modems can perform underwater communication at up to some kb/s. The length of the message and time limitations must be considered.
• Range: The range of operation for the vehicle's communication has an impact on the cost of the system. Acoustic modems are suitable from short distances up to tens of km. Consider that a longer range will increase the latency and power consumption of the system.
• Power consumption: Depending on the range and modulation, the power consumption is in the range of 0.1 W to 1 W in receiving mode and 10 W to 100 W in transmission mode.

Table 5 contains a few options of acoustic modems commercially available.

Table 5. Commercial acoustic modems.

Name Max Bit Rate (bps) Range (m) Frequency Band (kHz)
Teledyne Benthos ATM-925 [102] 360 2000–6000 9–27
WHOI Micromodem [103] 5400 3000 16–21
Linkquest UWM 1000 [104] 7000 350 27–45
Evologics S2C R 48/78 [105] 31,200 1000 48–78
Sercel MATS 3G 34 kHz [106] 24,600 5000 30–39
L3 Oceania GPM-300 [107] 1200 45,000 Not specified
Tritech Micron Data Modem [108] 40 500 20–28
Bluerobotics Water Linked M64 Acoustic Modem [109] 64 200 100–200

The working principles of underwater acoustic communication can be described as follows [110]:
First, information is converted into an electrical signal by an electrical transmitter. Second, after digital
processing by an encoder, the transducer converts the electrical signal into an acoustic signal. Third,
the acoustic signal propagates through the water and carries the information to the receiving transducer, where the acoustic signal is converted back into an electrical signal. Finally, after
the digital signal is deciphered by the decoder, the information is converted to an audio, text or picture
by the electrical receiver.
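The encode–transmit–decode chain described above can be illustrated with a toy binary frequency-shift-keying (FSK) modem: bits are mapped to one of two tones, and the receiver decides each bit by correlating against both tones. The sample rate, carrier frequencies, and symbol length are arbitrary illustrative values, far simpler than the modulation used by any real acoustic modem.

```python
import numpy as np

FS = 96_000                 # sample rate [Hz] (illustrative)
F0, F1 = 18_000, 22_000     # tone frequencies for bit 0 and bit 1 [Hz]
SYMBOL = 0.01               # symbol duration [s]
t = np.arange(int(FS * SYMBOL)) / FS

def modulate(bits):
    """Map a bit sequence to a concatenation of tones (the signal driving the transducer)."""
    return np.concatenate([np.sin(2 * np.pi * (F1 if b else F0) * t) for b in bits])

def demodulate(signal):
    """Decide each symbol by comparing its correlation energy against the two reference tones."""
    n, bits = len(t), []
    for k in range(len(signal) // n):
        chunk = signal[k * n:(k + 1) * n]
        e0 = abs(np.dot(chunk, np.exp(-2j * np.pi * F0 * t)))
        e1 = abs(np.dot(chunk, np.exp(-2j * np.pi * F1 * t)))
        bits.append(1 if e1 > e0 else 0)
    return bits

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = modulate(message) + np.random.default_rng(2).normal(0.0, 0.3, len(t) * len(message))
print("sent:   ", message)
print("decoded:", demodulate(received))
```

Real channels add multipath, Doppler, and severe attenuation, which is why practical modems use far more robust modulation and coding schemes.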
Acoustic communications face many challenges, such as small bandwidth, low data rate, high
latency, and ambient noise [111]. These shortcomings might cause a communication cycle in a collaborative mission to take several seconds, or even more than a minute. Considering this,
Yang et al. [112] analyzed formation control protocols for multiple underwater vehicles in the presence of
communication flaws and uncertainties. The error Port-Hamiltonian model about the desired trajectory
was introduced and then, in the presence of relative information constraints or uncertainties, the
formation control law was achieved by solving specific limitations of the linear matrix inequality
problem. Abad et al. [113] introduced a communication scheme between the AUVs and a unique
representation of the overall vehicle state that limits message size. To limit data sent, every reported
position and path plan is encoded using a grid encoding scheme. Authors implemented a decentralized
model predictive control algorithm—centralized schemes are typical for swarms of AUVs—to control
teams of AUVs that optimizes vehicle control inputs to account for the limitations of operating in an
underwater environment. They simulated their proposal and showed the effectiveness of their approach in a Mine/Countermine mission. Another way to deal with acoustic communication issues was presented
by Hallin et al. [114]. They proposed that enabling the AUVs to anticipate acoustic messages would
improve their ability to successfully complete missions. They outlined an approach to AUV message
anticipation in AUVish-BBM (BBM suffix includes the initials of the researchers directly involved
in dialect development: Beidler, Bean and Merrill [115]), an acoustic communications language
for AUVs [116], based on a University of Idaho-developed paradigm called Language-Centered
Intelligence (LCI). They demonstrated a new application of LCI in the field of cooperative AUV
operations and argued that message anticipation can be effectively deployed to correct message errors.
The structure, content, and context of individual messages of AUVish-BBM, together with its associated
communication protocol, supply a systematic framework that can be utilized to anticipate messages
expressed by AUVs performing collaborative missions.
The absence of an underwater communication standard has been a problem for collaborative
teams of AUVs. In 2014, Potter et al. presented the JANUS underwater communication standard [117],
a basic and robust tool for collaborative underwater communications designed and tested by the NATO
Centre for Maritime Research and Experimentation. This opened the possibilities for simple integration
of different robots using this standard [118–121] for collaborative tasks as underwater surveillance.
To improve the performance of underwater communication, optical technologies have been
tested either stand-alone [122,123] or as a complement for an acoustic system [100]. Laser submarine
communication has some advantages such as a high bit rate, higher security and broad bandwidth.
Blue-green light (whose wavelength is 470–580 nm) penetrates water better and its energy attenuation
is less than any other wavelength light [124]. Thus, researchers have explored optical underwater
communication systems based in blue-green light, to allow an underwater vehicle to receive a message
from an aerial/spatial system at any depth despite its actual speed, course and distance from the
transmitter. Wiener et al. [122] were seeking a system to deliver a message from a satellite to a
submarine, avoiding the need for the submarine to navigate close to the surface for retrieving the
message as happened with the radio-frequency systems used at the time. Authors stated that blue
light has the potential to accomplish the result expected in the future. However, their research was
only a brief representation of what could be expected when working in such a difficult environment.
Puschell et al. [123] performed the first demonstration of a two-way laser communication between a
submarine vehicle and an aircraft; and concluded that a blue-green laser communication system could,
someday, reach operational requirements. Sangeetha et al. [125] made experiments to study the optical
communication between an underwater body and a space platform using a red laser with a 635 nm
wavelength. Results showed that the performance of the red-light system was lower than the expected
for a blue-light system in terms of the attenuation coefficient observed. Corsini et al. [126] worked on
an optical wireless communication system where both transmitter and receiver were underwater.
The optical signal with a 470 nm wavelength was obtained modulating two LED arrays and received
by an avalanche photodiode module. Error free transmission was achieved in the three configurations
under test (6.25 Mbit/s, 12.5 Mbit/s, and 58 Mbit/s) through 2.5 m clean water.
Although some authors such as Wiener and Puschell have claimed that laser communication systems for underwater vehicles could be a possibility, recent studies showed that the technology is still limited. Laser-based systems cannot reach a target at more than a few tens of meters depth under ideal conditions [127]. Thus, Farr et al. [100] developed an optical communication system that complements and integrates an acoustic system. The result was an underwater communication system capable of offering high data rates and low latency when within optical range, combined with the long range and
robustness of acoustics when outside of optical range. Authors have demonstrated robust multi-point,
low power omnidirectional optical communications over ranges of 100 m at data rates up to 10 Mb/s
using blue-green emitters.

3.2. Collaborative Navigation


Groups of AUVs can work together under different navigation schemes, which are generally
parallel or leader-follower [110]. On a parallel formation (shown in Figure 15), all AUVs are equipped
with the same systems and sensors, to locate and navigate themselves precisely and to communicate with their neighbor AUVs.

Figure 15. Parallel navigation of AUVs.

In a leader-follower scheme (shown in Figure 16), the leader AUV is equipped with high-precision instruments while the follower AUVs are equipped with low-precision equipment [128]. Communication is only required between the leader vehicle and its followers; there is no need for the followers to communicate with each other.

Figure 16. Leader-follower navigation of AUVs with AUV 1 as the leader of the formation.
The lower cost and the reduced communications needs make the leader-follower scheme the main
navigation control method of AUVs. Its basic principles and algorithms are relatively mature [110].
However, unstable communications and communication delays are still challenging problems and
need to be addressed. Since there are problems with signals when multiple systems emit at the
same time, co-localization of AUVs is mostly based on time synchronization. However, time
synchronization methods have some shortcomings such as the need for AUVs to go to surface
to receive the synchronization signal. As an alternative, Zhang et al. [129] studied multi-AUVs
collaborative navigation and positioning without time synchronization. Authors established a
collaborative navigation positioning model for multi-AUVs and designed an EKF for collaborative
navigation. This design only needs the time delay of the AUV itself and does not need to consider
whether the AUV has synchronized with others. In simulated experiments, they compared the
precision of their algorithm with a prediction model and results showed that, even when error increases
over time, the precision of co-localization without time synchronization was higher. Yan et al. [130]
addressed the problems of leader-follower AUV formation control with model uncertainties, current
disturbances, and unstable communication. The effectiveness of the method is simulated by tracking a
spiral helix curve path with one leader AUV and four follower AUVs. Considering model uncertainties
and current disturbances, a second-order integral AUV model with a nonlinear function and current
disturbances was established. The simulation results showed that leader-follower AUV formation
controllers are feasible and effective. After an adjustment period, all follower AUVs can converge to
the desired formation structure, and the formation can keep tracking the desired path.
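As a simple illustration of the leader-follower idea—not the controllers of [129,130], which explicitly handle model uncertainty and current disturbances—the sketch below drives each follower toward a desired offset expressed in the leader frame with a saturated proportional law, in the horizontal plane only. Gains, speeds, and offsets are invented values.

```python
import numpy as np

def follower_step(leader_pos, leader_heading, follower_pos, offset, kp=0.8, dt=0.1, v_max=1.5):
    """One control step moving a follower toward its slot in the formation.

    offset is the desired (along-track, cross-track) position relative to the leader,
    given in the leader frame; kp, dt, and v_max are illustrative tuning values.
    """
    c, s = np.cos(leader_heading), np.sin(leader_heading)
    rot = np.array([[c, -s], [s, c]])                  # leader frame -> world frame
    target = leader_pos + rot @ np.asarray(offset)     # desired world position of this follower
    velocity = kp * (target - follower_pos)            # proportional drive toward the slot
    speed = np.linalg.norm(velocity)
    if speed > v_max:                                  # saturate to a plausible AUV speed
        velocity *= v_max / speed
    return follower_pos + velocity * dt

# Example: leader heading east at 1 m/s, one follower assigned a slot 10 m behind and 5 m to port.
leader, follower = np.array([0.0, 0.0]), np.array([-30.0, 20.0])
for _ in range(600):
    leader = leader + np.array([0.1, 0.0])             # leader advances 0.1 m per step
    follower = follower_step(leader, 0.0, follower, offset=(-10.0, 5.0))
slot = leader + np.array([-10.0, 5.0])
print("remaining formation error [m]:", round(float(np.linalg.norm(follower - slot)), 2))
```

With only a proportional term the follower keeps a small steady-state lag behind a moving leader, which is one reason the cited works use richer control laws and explicitly account for communication delays.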
Cui et al. [131] focused on the problem of tracking control for multi-AUV systems and proposed
an adaptive fuzzy-finite time control method. In this algorithm, algebraic graph theory is combined
with a leader-follower architecture for describing the communication of the system. Then, the error
compensation mechanism is introduced. Finally, the application of finite time and fuzzy logic system
improves the convergence rate and the robustness of the multi-AUV system. The effectiveness of the proposed algorithm was illustrated by simulation, choosing an architecture of 4 AUVs including 1 leader and 3 followers. In order to avoid the unknown internal and external interferences, the algebraic
graph theory and a fuzzy logic system technique are integrated into the distributed controllers. The
simulation results demonstrate the effectiveness of the proposed algorithm and robustness of the
multi-AUV system, with a faster convergence speed compared with other algorithms.
When AUVs navigate in closed formations, the delay between the transmission and reception of
the acoustic signals represents a high risk. Therefore, a solution with a response time significantly
faster must be explored. Bosh J. et al. [11] developed an algorithm for AUVs navigating on a close
formation, where light markers and artificial vision are used to allow the estimation of the pose of a
target vehicle at short ranges with high accuracy and execution speed. In the experiments presented,
the filtered pose estimates were updated at approximately 16 Hz, with a standard deviation lower
than 0.2 m in the distance uncertainty between vehicles, at distances between 6 m and 12 m. As
expected, the results showed that the system performs adequately for vehicle separations smaller than
10 m, while the tracking becomes intermittent for longer distances due to the challenging visibility
conditions underwater.
Another alternative for cooperative navigation of AUVs is the use of systems that allow the vehicles
within a team to help each other with their localization. Teck et al. [132] proposed a TBN system for
cooperative AUVs. The approach consists of an altimeter and an acoustic modem equipped on each
vehicle and a bathymetric terrain map. The localization is performed via decentralized particle filtering.
The vehicles in the team are assumed to have their system time synchronized. A simple scheduling is
adopted so that each vehicle in the team broadcasts its local state information sequentially using acoustic
communication. This information includes the vehicle current position, the filter estimated covariance
matrix, and the latest water depth measurement. When the acoustic signal is received by another
vehicle the time-of-arrival can be calculated to determine the inter-vehicle distance. Results showed
that localization performance improves as the number of the vehicles in the team increase, at least up
Appl. Sci. 2020, 10, 1256 25 of 37

to four, and when they are in range for a proper acoustic communication. The average positioning
error was in the range of a few meters in those conditions. Tan et al. [133] developed a cooperative path
planning for range-only localization. Authors explored the use of a single-beacon vehicle for range-only
localization to support other AUVs. Specifically, they focused on cooperative path-planning algorithms
for the beacon vehicle using dynamic programming formulations. These formulations take into account
and minimize the positioning errors being accumulated by the supported AUV. Implementation of
the cooperative path-planning algorithms was in a simulated environment. The simulations were
conducted with different types of ranging aids, each transmitted from a single beacon. The ranging
aids used were: single fixed beacon, circularly moving beacon, and cooperative beacon. Experimental
results were also obtained from a field trial conducted near Serangoon Island, Singapore. Average
error was
Appl. Sci. reduced
2020, 10, x FORup to REVIEW
PEER 19.1 m over a traveled distance of 1.5 km. De Palma et al. [134] made a
25 of 36
similar approach. The problem addressed by the authors consisted of designing a relative localization
vehicles for
solution measuring
a networked mutual
groupranges. Themeasuring
of vehicles aim of mutual
the project
ranges.was
The toaimexploit inter-vehicle
of the project was to
communications
exploit inter-vehicle to communications
enhance the range-based
to enhancerelative position estimation.
the range-based Vehicles
relative position are considered
estimation. Vehicles
capable
are to know
considered their own
capable position,
to know orientation,
their own position,and velocity regarding
orientation, and velocity a common
regardingframe. Such
a common
vehiclesSuch
frame. share their own
vehicles share information through their
their own information communication
through channel channel
their communication and theyand
canthey
obtain
can
measurements
obtain measurements of their relative
of their Euclidean
relative Euclideandistance with
distance respect
with respectto toseveral
severalother
other agents.
agents. The
connection topology of the agents was represented through a relative relative position
position measurement
measurement graph
and a simulation, relative to a group of 44 agents, agents, was
was performed
performed with
withdifferent
different connection
connection topologies.
topologies.
During the simulation, z error error remained
remained in in the
the range
range of
of ±± 1 m after a time-lapse of 3000 s.

3.3. Collaborative Missions


3.3. Collaborative Missions
Surveillance and intervention are typically the kinds of missions designed
designed for
for teams
teams of
of AUVs.
AUVs.
Surveillance missions require the AUVs to detect, localize, follow, and classify targets, inspect or
explore
explore the ocean. Meanwhile, intervention missions require the AUVs to interact with objects within
the environment. Examples of both missions are represented
represented in
in Figure
Figure 17.
17.

(a) (b)
Figure 17. Collaborative
Collaborativemissions
missionsfor
forAUVs.
AUVs.(a) (a)
Cooperative inspection
Cooperative of subsea
inspection formations.
of subsea (b)
formations.
Cooperative
(b) object
Cooperative recovery.
object recovery.

3.3.1. Search Missions


3.3.1. Search Missions
A good search mission needs to minimize the number of vehicles required and maximize the
efficiency of the search. Oceanic, biologic, and geologic variability of underwater environments
impacts the search performance of teams of AUVs. To address search planning in these conditions,
where the detection process is prone to false alarms, Baylog et al. [135] applied a game theoretic
approach to the optimization of a search channel characterization of the environment. The search
space is partitioned into discrete cells in which objects of interest may be found. The game theory
approach seeks to find the equilibrium solution of the game rather than the optimal solution to a
fixed objective of maximizing the value-over-cost. To demonstrate effectiveness in achieving the game
objective, a sequence of searches by four search agents over a search region was planned and simulated.
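A highly simplified sketch of this cell-partitioned search idea is given below: the region is divided into discrete cells with prior probabilities of containing the object, and a budget of sensor looks is assigned greedily according to the expected detection gain. This is only an illustrative greedy allocation under assumed detection and false-alarm parameters, not the ROC-based game formulation of [135].

```python
import numpy as np

def allocate_search_effort(prior, n_looks, p_detect=0.8, false_alarm_cost=0.1):
    """Greedily assign a budget of sensor 'looks' to discrete search cells.
    prior[i] is the probability that the object of interest is in cell i.
    'undetected[i]' tracks the probability that the object is in cell i and
    has not been detected yet; each look detects it with probability p_detect.
    The false-alarm term is a crude penalty for looking at likely-empty cells.
    Simplified greedy illustration only."""
    undetected = np.asarray(prior, dtype=float).copy()
    looks = np.zeros(len(undetected), dtype=int)
    for _ in range(n_looks):
        gain = p_detect * undetected - false_alarm_cost * (1.0 - undetected)
        best = int(np.argmax(gain))
        looks[best] += 1
        undetected[best] *= (1.0 - p_detect)   # object may remain undetected
    return looks, undetected

prior = [0.05, 0.40, 0.25, 0.20, 0.10]          # hypothetical 5-cell region
looks, undetected = allocate_search_effort(prior, n_looks=8)
print("looks per cell:", looks)
print("probability the object is still undetected:", round(undetected.sum(), 3))
```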
Li et al. [136] proposed a sub-region collaborative search strategy and a target searching algorithm
based on a perceptual adaptive dynamic prediction. The reality of the local environment is obtained
by using the FLS of the multi-AUV system. The simulation experiments verified that the algorithm
proposed successfully searched and tracked the target. Moreover, in the case of an AUV failure, it can
also ensure that other AUVs cooperate to complete the remaining target search tasks. Algorithms for
collaborative search based on Neural Networks (NN) are being designed to overcome the variability of
the environment and the presence of obstacles. Lv et al. [137] presented a region search algorithm based
on a Glasius Bio-inspired Neural Network (GBNN), which can be used for AUVs to perform target
search tasks in underwater regions with obstacles. In this algorithm, the search area is divided into
several discrete sub-areas and connections are made between adjacent neurons. AUVs and obstacles
are introduced to the network as sources of excitation in order to avoid collisions during the search
process. By constructing hypothetical targets and introducing them into the NN as stimulating sources
of excitation, the AUVs are guided to quickly search for areas where the target is likely to exist and
they can efficiently complete the search task. Sun et al. [138] designed a new strategy for collaborative
search with a GBNN algorithm. In the algorithm, a grid map is set up to represent the working
environment and NN are constructed where each AUV corresponds to a NN. All the AUVs must
share information about the environment and, to avoid collision between the vehicles, each AUV is
treated as a moving obstacle in the region. Simulation was conducted in MATLAB to confirm that
through the proposed algorithm, multi-AUVs can plan reasonable and collision-free coverage paths
and reach full coverage of the same task area with division of labor and cooperation. Yan et al. [139]
addressed a control problem for a group of AUVs tracking a moving target with varying velocity.
For this algorithm, at least one AUV is assumed to be capable of obtaining information about the
target, and the communication topology graph of the vehicles is assumed to be undirected and connected.
Simulations were made using MATLAB to demonstrate the efficiency and effectiveness of the proposed
control algorithm, considering a system with three vehicles.
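To make the GBNN idea concrete, the sketch below implements a minimal grid version of it under assumed parameters: unvisited free cells act as excitatory sources, obstacles as inhibitory sources, activity diffuses to neighboring cells, and the AUV always moves to the admissible neighbor with the highest activity. The update rule, transfer gain, and grid are illustrative simplifications rather than the exact formulations of [137,138].

```python
import numpy as np

def gbnn_step(activity, unvisited, obstacles, transfer=0.2):
    """One discrete-time update of a Glasius-style neural grid: unvisited free
    cells act as excitatory sources (+1), obstacles as inhibitory sources (-1),
    and activity diffuses from 4-connected neighbors (illustrative rule)."""
    external = np.where(obstacles, -1.0, np.where(unvisited, 1.0, 0.0))
    padded = np.pad(activity, 1)                       # zero-padded borders
    neighbor_sum = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                    padded[1:-1, :-2] + padded[1:-1, 2:])
    return np.clip(external + transfer * neighbor_sum, -1.0, 1.0)

def next_cell(activity, pos, obstacles):
    """Move to the free 4-neighbor with the highest neural activity."""
    rows, cols = activity.shape
    r, c = pos
    options = [(r + dr, c + dc)
               for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
               if 0 <= r + dr < rows and 0 <= c + dc < cols
               and not obstacles[r + dr, c + dc]]
    return max(options, key=lambda cell: activity[cell]) if options else pos

# Tiny example: a 6 x 6 region with a small obstacle block and a single AUV.
obstacles = np.zeros((6, 6), dtype=bool)
obstacles[2:4, 3] = True
unvisited = ~obstacles
activity = np.zeros((6, 6))
pos = (0, 0)
unvisited[pos] = False
for _ in range(60):
    activity = gbnn_step(activity, unvisited, obstacles)
    pos = next_cell(activity, pos, obstacles)
    unvisited[pos] = False
print("free cells still unvisited:", int(unvisited.sum()))
```

For a multi-AUV team, each vehicle would maintain its own grid and the other vehicles would be injected as temporary inhibitory sources, which is the mechanism the text above describes for collision avoidance.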

3.3.2. Intervention Missions


There is much more in the underwater environment for AUVs beyond survey missions.
Manipulating objects, repairing structures or pipes, recovering black-boxes, extracting samples,
among other tasks, make it necessary to have a platform with the capacity to autonomously navigate
and perform them, since nowadays these are mostly done by manned or remotely operated vehicles.
Researchers have worked in recent years on the design and development of such platforms, which
is difficult even for a single Intervention AUV (I-AUV) due to the complexity of the vehicle itself
plus the manipulator system. The Girona 500 I-AUV is an example of a single vehicle platform for
intervention missions. This vehicle is used to autonomously dock into an adapted subsea panel and
perform the intervention task of turning a valve and plugging in/unplugging a connector [140]. The
same vehicle was also equipped with a three-fingered gripper and an artificial vision system to locate
and recover a black-box [141]. Other projects, such as the Italian national project MARIS [142] have
been launched to produce theoretical, simulated and experimental results for intervention AUVs
either standalone or for collaborative teams. The aim of the MARIS project was the development of
technologies that allow the use of teams of AUVs for intervention missions, in particular: reliable
guidance and control, stereovision techniques for object recognition, reliable grasp, manipulation and
transportation of objects, coordination and control methods for large object grasp and transportation,
high-level mission planning techniques, underwater communication, and the design and realization of
prototype systems, allowing experimental demonstrations of integrating the results from the previous
objectives. The open-frame, fully actuated robotic platform R2 ROV/AUV and the Underwater Modular
Manipulator (UMA) were used for the MARIS project; neither of them was developed within the project.
A vision system and a gripper were designed for the autonomous execution of manipulation tasks.
Tests were performed to assess the correct integration of all the components, with a success rate
of the grasping operation of up to 70% [143]. This project was an important achievement in terms of
autonomous underwater manipulation, and the theoretical studies for multi-vehicle localization and
collaborative underwater manipulation systems will be the next step to be demonstrated in field trials.
For collaborative I-AUVs, Simetti et al. [144,145] described a novel cooperative control policy for the
transportation of large objects in underwater environments using two manipulator vehicles. The
cooperative control algorithm takes into account all mission stages: grasping, transportation, and the
final positioning of the shared object by two vehicles. To deal with the limitations of acoustic
communication, the cooperative transportation of the object was achieved successfully by
exchanging just the tool frame velocities. A simulation was done, with the UwSim dynamic simulator,
using two vehicles of 6 degrees of freedom in order to test the control algorithms. The simulation
consisted of completing the following tasks: keeping away from joint limits, keeping the manipulability
measure above a certain threshold, maintaining the horizontal attitude of the vehicles, maintaining
a fixed distance between the vehicles, and reaching the desired goal position. The system managed to
accomplish the final objective of the mission successfully, by transporting the object to the desired goal
position. Conti et al. [146] proposed an innovative decentralized approach for cooperative mobile
manipulation of intervention AUVs. The control architecture deals with the simultaneous control of
the vehicles and robotic arms, and the underwater localization. Simulations were made in MATLAB
Simulink to test the potential of the system. The cooperative mobile manipulation was performed by
four AUVs placed at the four corners of the object and obstacles were introduced as spheres. According
to the authors, results were very encouraging because the AUV swarm kept both the formation during
the manipulation phase and the object during an avoidance phase performed due to the presence of
obstacles. Cataldi et al. [147] worked on cooperative control of underwater vehicle-manipulator systems.
The authors proposed an architecture that takes into account most of the underwater constraints:
uncertainty in the model, low sensor bandwidth, position-only arm control, geometric-only object pose
estimation. The simulated system, designed in MATLAB and adapted with SimMechanics, consisted
of two AUVs transporting a bar. Results on the bar position and the forces applied at the end effectors
were promising for possible real applications. Heshmati-alamdari et al. [148] worked on a similar
system. A nonlinear model predictive control approach was proposed for a team of AUVs transporting
an object. The model has to deal with the coupled dynamics between the robots and the object. The
feedback relies only on each AUV's local measurements to deal with communication issues, and no data
is exchanged between the robots. A real-time simulation, based on UwSim dynamic simulator and
running on the Robot Operating System (ROS), was performed to validate the proposed approach,
where the aim for the team of AUVs was to follow a set of pre-defined waypoints while avoiding
obstacles within the workspace, which was successfully achieved.
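A common thread in these cooperative transportation schemes is the rigid-grasp constraint: once two vehicle-manipulator systems hold the same object, their tool frames can no longer move independently. The short sketch below illustrates only that kinematic coupling, computing the linear velocity each end effector must track for a commanded object twist; the numbers are hypothetical and this is not the control law of [144–148].

```python
import numpy as np

def grasp_point_velocities(object_twist, grasp_offsets):
    """Linear velocity each end effector must track when rigidly grasping a
    shared object: v_i = v_o + omega x r_i, where r_i is the grasp point
    relative to the object frame origin (generic rigid-body kinematics)."""
    v_o = np.asarray(object_twist[:3], dtype=float)     # object linear velocity
    omega = np.asarray(object_twist[3:], dtype=float)   # object angular velocity
    return [v_o + np.cross(omega, np.asarray(r, dtype=float)) for r in grasp_offsets]

# Two I-AUVs grasping the ends of a 2 m bar: the object should surge at 0.2 m/s
# while yawing at 0.05 rad/s (hypothetical command).
twist = [0.2, 0.0, 0.0, 0.0, 0.0, 0.05]
offsets = [[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
for i, v in enumerate(grasp_point_velocities(twist, offsets)):
    print(f"end effector {i}: commanded linear velocity {v} m/s")
```

Exchanging only such tool-frame quantities between the vehicles is what keeps the communication load compatible with acoustic links, as reported in [144,145].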
A summary of collaborative AUVs missions is presented in Table 6, as well as the potential
applications and the strategies proposed in recent years.

Table 6. Summary on collaborative AUV missions.

Missions: Collaborative surveillance
Applications: Searching; Tracking; Mapping; Inspecting
Approaches: Game theory; Dynamic prediction theory; Glasius bio-inspired neural networks; Consensus dynamics; Acoustic systems; Active landmarks and cameras
Results: Simulation and Experimental

Missions: Collaborative intervention
Applications: Recovering; Manipulating
Approaches: Decentralized strategies; Minimal information exchange strategy; Nonlinear model predictive control
Results: Simulation

3.4. Collaborative AUVs Overview


The nature of underwater environments makes the use of communication systems with high-frequency
signals difficult. This is due to the rapid attenuation that permits propagation only at very short distances.
Acoustic signals have better performance, but face many challenges such as signal interference and
small bandwidth, which result in the need for time synchronization methods and hence produce
high latency in the system. Another option is a light-based system, which offers a higher bandwidth
but only at short/medium ranges. Blue/green light propagates better underwater than any other
light; but, when the range for communication is increased, the power consumption, weight, and
volume of the equipment required also increase.
If the inter-vehicle communication system is good enough in terms of range, bandwidth and rate,
range-only/single-beacon can be an effective method for target localization and collaborative navigation
of teams of AUVs. Vision-based systems are also an option that has the potential to control AUV
formations without the need of relying on inter-vehicle communications, but only if the environmental
conditions are favorable for light propagation and sensing.
Collaborative missions are quite difficult to implement in real conditions. Assembling a team
of AUVs with the proper technology to overcome localization, navigation, and communication
shortcomings is difficult for researchers, who have to limit their proposals to numerical simulations.
Most of the authors use MATLAB Simulink to perform their simulations and some tools such as the
former SimMechanics (now called Simscape Multibody). Another simulation environment commonly
used for underwater robotics is the UnderWater Simulator (UWSim) [149]. With those tools, researchers
are working on pushing the state of the art in terms of control, localization, and navigation algorithms.
Among them, machine learning algorithms are gaining considerable attention. They are being employed in
different aspects such as navigation [150,151], obstacle avoidance, and multi-AUV formation control.

4. Conclusions
A review of different alternatives for underwater localization, communication, and navigation
of Autonomous Underwater Vehicles is addressed in this work. Although Section 2 discusses
single-vehicle localization and navigation, the aim of this work is to show that those technologies
are being applied to multi-vehicles systems, or can be implemented in the future. Every underwater
mission is different and has its own limitations and challenges. For that reason, it is not possible to
state which localization or navigation system has the best performance. For a long-range mission
(kilometers), an accuracy of tens or hundreds of meters from an inertial-based navigation system can be
acceptable, as the main characteristic wanted is the long-range capacity. In a small navigation
environment, i.e., a docking station or a laboratory tank, there is a need for much better accuracy to
avoid collisions with the tank’s walls. In that situation, an SBL/USBL system at an operation frequency
of 200 kHz can perform with errors in the centimeter range or, if the water conditions are favorable, a
visual-based system can perform even better at a lower cost.
Current achievements in the field of collaborative AUVs have been also presented, including
communication, collaborative localization and navigation, surveillance, and intervention missions.
The use of a hybrid (acoustic and light-based) system is a promising option for the communication of
collaborative AUVs. The acoustic sub-system can handle the long-range communications, while
the light-based sub-system takes care of the inter-vehicle communication, where a high rate is critical for collision
avoidance. A hybrid system can also be an interesting alternative for collaborative navigation. Acoustic
methods can be implemented in a team of AUVs for medium/long-range navigation, while a
visual-based method is used to maintain the formation and avoid collisions between the vehicles. In
terms of algorithms, machine learning seems to be one of the best approaches to achieve collaborative
navigation and to give a team the capacity to perform complex surveillance and intervention missions.
Regarding collaborative intervention missions, which have mostly been addressed with numerical
simulations, the next step is to test the algorithms in real experiments. Such experimentation can be
done in controlled conditions, such as a laboratory tank, where the vehicles would not have to deal
with the changing conditions of the sea, so researchers can focus on the collaborative task algorithms
such as the carrying of an object by two vehicles.

Author Contributions: Conceptualization, J.G.-G., A.G.-E., L.G.G.-V., and T.S.-J.; methodology, J.G.-G., A.G.-E.,
E.C.-U., L.G.G.-V., and T.S.-J.; investigation, J.G.-G.; writing—original draft preparation, J.G.-G.; writing—review
and editing, all authors; visualization, J.G.-G.; supervision, A.G.-E., E.C.-U., L.G.G.-V., and T.S.-J.; funding
acquisition J.A.E.C.; project administration, J.G.-G., A.G.-E., L.G.G.-V., and T.S.-J. All authors have read and agreed
to the published version of the manuscript.
Funding: The authors would like to acknowledge the financial support of Tecnologico de Monterrey, in the
production of this work.
Acknowledgments: The authors would like to acknowledge support from CONACyT for PhD studies of first
author (scholarship number 741738).
Conflicts of Interest: The authors declare no conflict of interest.

Abbreviations
AFRB Autonomous Field Robotics Laboratory
AHRS Attitude and Heading Reference System
AUV Autonomous Underwater Vehicle
BITAN Beijing university of aeronautics and astronautics Inertial Terrain-Aided Navigation
BK Bandler and Kohout
CRNN Convolution Recurrent Neural Network
DR Dead-Reckoning
DSO Direct Sparse Odometry
DT Distance Traveled
DVL Doppler Velocity Logger
EKF Extended Kalman Filter
ELC Extended Loosely Coupled
FLS Forward-Looking SONAR
FTPS Fitting of Two Point Sets
GBNN Glasius Bio-inspired Neural Network
GN Geophysical Navigation
GPS Global Positioning System
HSV Hue Saturation Value
I-AUV Intervention AUV
IMU Inertial Measurement Unit
INS Inertial Navigation Systems
KF Kalman Filter
LBL Long Baseline
LC Loosely Coupled
LCI Language-Centered Intelligence
MEMS Micro-Electro-Mechanical System
NRT Near-Real-Time
NN Neural Networks
PF Particle Filter
PL-SLAM Point and Line SLAM
PMF Point Mass Filter
PS Pressure Sensor
PTAM Parallel Tracking And Mapping
RMSE Root-Mean-Square Error
RNN Recurrent Neural Network
ROS Robot Operating System
SBL Short Baseline
SINS Strapdown Inertial Navigation System
SITAN Sandia Inertial Terrain Aided Navigation
SLAM Simultaneous Localization And Mapping
SoG Sum of Gaussian
SONAR Sound Navigation And Ranging
SVO Semi-direct Visual Odometry
TAN Terrain-Aided Navigation
TBN Terrain-Based Navigation
TC Tightly Coupled
TERCOM TERrain COntour-Matching
TERPROM TERrain PROfile Matching
TRN Terrain-Referenced Navigation
UKF Unscented Kalman Filter
USBL Ultra-Short Baseline
USV Unmanned Surface Vehicle
VIO Visual Inertial Odometry
VO Visual Odometry

References
1. Petillo, S.; Schmidt, H. Exploiting adaptive and collaborative AUV autonomy for detection and
characterization of internal waves. IEEE J. Ocean. Eng. 2014, 39, 150–164. [CrossRef]
2. Massot-Campos, M.; Oliver-Codina, G. Optical sensors and methods for underwater 3D reconstruction.
Sensors 2015, 15, 31525–31557. [CrossRef] [PubMed]
3. Xu, J.; Chen, X.; Song, X.; Li, H. Target recognition and location based on binocular vision system of UUV.
Chin. Control Conf. CCC 2015, 2015, 3959–3963.
4. Ridao, P.; Carreras, M.; Ribas, D.; Sanz, P.J.; Oliver, G. Intervention AUVs: The next challenge. Annu. Rev.
Control 2015, 40, 227–241. [CrossRef]
5. Abreu, N.; Matos, A. Minehunting Mission Planning for Autonomous Underwater Systems Using
Evolutionary Algorithms. Unmanned Syst. 2014, 2, 323–349. [CrossRef]
6. Reed, S.; Wood, J.; Haworth, C. The detection and disposal of IED devices within harbor regions using AUVs,
smart ROVs and data processing/fusion technology. In Proceedings of the 2010 International WaterSide
Security Conference, Carrara, Italy, 3–5 November 2010; pp. 1–7.
7. Ren, Z.; Chen, L.; Zhang, H.; Wu, M. Research on geomagnetic-matching localization algorithm for unmanned
underwater vehicles. In Proceedings of the 2008 International Conference on Information and Automation,
Zhangjiajie, China, 20–23 June 2008; pp. 1025–1029.
8. Leonard, J.J.; Bennett, A.A.; Smith, C.M.; Feder, H.J.S. Autonomous Underwater Vehicle Navigation. MIT
Mar. Robot. Lab. Tech. Memo. 1998, 1, 1–17.
9. Rice, H.; Kelmenson, S.; Mendelsohn, L. Geophysical navigation technologies and applications. In Proceedings
of the PLANS 2004 Position Location and Navigation Symposium (IEEE Cat. No.04CH37556), Monterey, CA,
USA, 26–29 April 2004; pp. 618–624.
10. Fallon, M.F.; Kaess, M.; Johannsson, H.; Leonard, J.J. Efficient AUV navigation fusing acoustic ranging and
side-scan sonar. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation,
Shanghai, China, 9–13 May 2011; pp. 2398–2405.
11. Bosch, J.; Gracias, N.; Ridao, P.; Istenič, K.; Ribas, D. Close-range tracking of underwater vehicles using light
beacons. Sensors 2016, 16, 429. [CrossRef] [PubMed]
12. Nicosevici, T.; Garcia, R.; Carreras, M.; Villanueva, M. A review of sensor fusion techniques for underwater
vehicle navigation. In Proceedings of the Oceans ’04 MTS/IEEE Techno-Ocean ’04 (IEEE Cat. No.04CH37600),
Kobe, Japan, 9–12 November 2005; pp. 1600–1605.
13. Paull, L.; Saeedi, S.; Seto, M.; Li, H. AUV navigation and localization: A review. IEEE J. Ocean. Eng. 2014, 39,
131–149. [CrossRef]
14. Che, X.; Wells, I.; Dickers, G.; Kear, P.; Gong, X. Re-evaluation of RF electromagnetic communication in
underwater sensor networks. IEEE Commun. Mag. 2010, 48, 143–151. [CrossRef]
15. Li, Z.; Dosso, S.E.; Sun, D. Motion-compensated acoustic localization for underwater vehicles. IEEE J. Ocean.
Eng. 2016, 41, 840–851. [CrossRef]
16. Zhang, J.; Han, Y.; Zheng, C.; Sun, D. Underwater target localization using long baseline positioning system.
Appl. Acoust. 2016, 111, 129–134. [CrossRef]
17. Han, Y.; Zheng, C.; Sun, D. Accurate underwater localization using LBL positioning system. In Proceedings
of the OCEANS 2015-MTS/IEEE Washington, Washington, DC, USA, 19–22 October 2016; pp. 1–4.
18. Zhai, Y.; Gong, Z.; Wang, L.; Zhang, R.; Luo, H. Study of underwater positioning based on short baseline
sonar system. Int. Conf. Artif. Intell. Comput. Intell. AICI 2009, 2, 343–346.
19. Smith, S.M.; Kronen, D. Experimental results of an inexpensive short baseline acoustic positioning system
for AUV navigation. Ocean. Conf. Rec. 1997, 1, 714–720.
20. Costanzi, R.; Monnini, N.; Ridolfi, A.; Allotta, B.; Caiti, A. On field experience on underwater acoustic
localization through USBL modems. In Proceedings of the OCEANS 2017, Aberdeen, UK, 19–22 June 2017;
pp. 1–5.
21. Morgado, M.; Oliveira, P.; Silvestre, C. Experimental evaluation of a USBL underwater positioning system.
In Proceedings of the ELMAR-2010, Zadar, Croatia, 15–17 September 2010; pp. 485–488.
22. Xu, Y.; Liu, W.; Ding, X.; Lv, P.; Feng, C.; He, B.; Yan, T. USBL positioning system based Adaptive Kalman
filter in AUV. In Proceedings of the 2018 OCEANS-MTS/IEEE Kobe Techno-Oceans (OTO), Kobe, Japan,
28–31 May 2018.
23. Reis, J.; Morgado, M.; Batista, P.; Oliveira, P.; Silvestre, C. Design and experimental validation of a USBL
underwater acoustic positioning system. Sensors 2016, 16, 1491. [CrossRef] [PubMed]
24. Petillot, Y.R.; Antonelli, G.; Casalino, G.; Ferreira, F. Underwater Robots: From Remotely Operated Vehicles
to Intervention-Autonomous Underwater Vehicles. IEEE Robot. Autom. Mag. 2019, 26, 94–101. [CrossRef]
25. Vaman, D. TRN history, trends and the unused potential. In Proceedings of the 2012 IEEE/AIAA 31st Digital
Avionics Systems Conference (DASC), Williamsburg, VA, USA, 14–18 October 2012; pp. 1–16.
26. Melo, J.; Matos, A. Survey on advances on terrain based navigation for autonomous underwater vehicles.
Ocean Eng. 2017, 139, 250–264. [CrossRef]
27. Jekeli, C. Gravity on Precise, Short-Term, 3-D Free- Inertial Navigation. J. Inst. Navig. 1997, 44, 347–357.
[CrossRef]
28. Perlmutter, M.; Robin, L. High-performance, low cost inertial MEMS: A market in motion! In Proceedings of
the 2012 IEEE/ION Position, Location and Navigation Symposium, Myrtle Beach, SC, USA, 23–26 April 2012;
pp. 225–229.
29. Jekeli, C. Precision free-inertial navigation with gravity compensation by an onboard gradiometer. J. Guid.
Control. Dyn. 2007, 30, 1214–1215. [CrossRef]
30. Rice, H.; Mendelsohn, L.; Aarons, R.; Mazzola, D. Next generation marine precision navigation system. In
Proceedings of the IEEE 2000 Position Location and Navigation Symposium (Cat. No.00CH37062), San Diego,
CA, USA, 13–16 March 2000; pp. 200–206.
31. ISM3D-Underwater AHRS Sensor-Impact Subsea. Available online: http://www.impactsubsea.co.uk/ism3d-2/
(accessed on 26 December 2019).
32. Underwater 9-axis IMU/AHRS sensor-Seascape Subsea BV. Available online: https://www.seascapesubsea.
com/product/underwater-9-axis-imu-ahrs-sensor/ (accessed on 26 December 2019).
33. Inertial Labs Attitude and Heading Reference Systems (AHRS)-Inertial Labs. Available online: https:
//inertiallabs.com/products/ahrs/ (accessed on 26 December 2019).
34. Elipse 2 series-Miniature Inertial Navigation Sensors. Available online: https://www.sbg-systems.com/
products/ellipse-2-series/#ellipse2-a_miniature-ahrs (accessed on 26 December 2019).
35. DSPRH-Depth and AHRS Sensor | TMI-Orion.Com. Available online: https://www.tmi-orion.com/en/
robotics/underwater-robotics/dsprh-depth-and-ahrs-sensor.htm (accessed on 26 December 2019).
36. VN-100-VectorNav Technologies. Available online: https://www.vectornav.com/products/vn-100 (accessed
on 21 January 2020).
37. XSENS-MTi 600-Series. Available online: https://www.xsens.com/products/mti-600-series (accessed on
21 January 2020).
38. Hurtos, N.; Ribas, D.; Cufi, X.; Petillot, Y.; Salvi, J. Fourier-based Registration for Robust Forward-looking
Sonar Mosaicing in Low-visibility Underwater Environments. J. Field Robot. 2014, 32, 123–151.
39. Galarza, C.; Masmitja, I.; Prat, J.; Gomariz, S. Design of obstacle detection and avoidance system for Guanay
II AUV. 24th Mediterr. Conf. Control Autom. MED 2016, 5, 410–414.
40. Braginsky, B.; Guterman, H. Obstacle avoidance approaches for autonomous underwater vehicle: Simulation
and experimental results. IEEE J. Ocean. Eng. 2016, 41, 882–892. [CrossRef]
41. Lin, C.; Wang, H.; Yuan, J.; Yu, D.; Li, C. An improved recurrent neural network for unmanned underwater
vehicle online obstacle avoidance. Ocean Eng. 2019, 189, 106327. [CrossRef]
42. LBL Positioning Systems | EvoLogics. Available online: https://evologics.de/lbl#products (accessed on
26 December 2019).
43. GeoTag. Available online: http://www.sercel.com/products/Pages/GeoTag.aspx?gclid=
EAIaIQobChMI5sqPkY3S5gIV2v_jBx0apwTAEAAYAiAAEgILm_D_BwE (accessed on 26 December 2019).
44. MicroPAP Compact Acoustic Positioning System-Kongsberg Maritime. Available online:
https://www.kongsberg.com/maritime/products/Acoustics-Positioning-and-Communication/acoustic-
positioning-systems/pap-micropap-compact-acoustic-positioning-system/ (accessed on 26 December 2019).
45. Subsonus | Advanced Navigation. Available online: https://www.advancednavigation.com/product/
subsonus?gclid=EAIaIQobChMI5sqPkY3S5gIV2v_jBx0apwTAEAAYASAAEgJNpfD_BwE (accessed on
26 December 2019).
46. Underwater GPS-Water Linked AS. Available online: https://waterlinked.com/underwater-gps/ (accessed on
26 December 2019).
47. Batista, P.; Silvestre, C.; Oliveira, P. Tightly coupled long baseline/ultra-short baseline integrated navigation
system. Int. J. Syst. Sci. 2016, 47, 1837–1855. [CrossRef]
48. Vasilijević, A.; Nad, D.; Mandi, F.; Miškovi, N.; Vukić, Z. Coordinated navigation of surface and underwater
marine robotic vehicles for ocean sampling and environmental monitoring. IEEE/ASME Trans. Mechatron.
2017, 22, 1174–1184. [CrossRef]
49. Sarda, E.I.; Dhanak, M.R. Launch and Recovery of an Autonomous Underwater Vehicle from a Station-Keeping
Unmanned Surface Vehicle. IEEE J. Ocean. Eng. 2019, 44, 290–299. [CrossRef]
50. Masmitja, I.; Gomariz, S.; Del Rio, J.; Kieft, B.; O’Reilly, T. Range-only underwater target localization:
Path characterization. In Proceedings of the Oceans 2016 MTS/IEEE Monterey, Monterey, CA, USA,
19–23 September 2016; pp. 1–7.
51. Bayat, M.; Crasta, N.; Aguiar, A.P.; Pascoal, A.M. Range-Based Underwater Vehicle Localization in the
Presence of Unknown Ocean Currents: Theory and Experiments. IEEE Trans. Control Syst. Technol. 2016, 24,
122–139. [CrossRef]
52. Vallicrosa, G.; Ridao, P. Sum of Gaussian single beacon range-only localization for AUV homing. Annu. Rev.
Control 2016, 42, 177–187. [CrossRef]
53. Zhang, T.; Wang, Z.; Li, Y.; Tong, J. A passive acoustic positioning algorithm based on virtual long baseline
matrix window. J. Navig. 2019, 72, 193–206. [CrossRef]
54. Zhang, F.; Chen, X.; Sun, M.; Yan, M.; Yang, D. Simulation study of underwater passive navigation system
based on gravity gradient. Int. Geosci. Remote Sens. Symp. 2004, 5, 3111–3113.
55. Meduna, D.K. Terrain Relative Navigation for Sensor-Limited Systems with Applications to Underwater Vehicles;
Doctor of Philosophy, Stanford University: Stanford, CA, USA, 2011.
56. Wei, E.; Dong, C.; Yang, Y.; Tang, S.; Liu, J.; Gong, G.; Deng, Z. A Robust Solution of Integrated SITAN with
TERCOM Algorithm: Weight-Reducing Iteration Technique for Underwater Vehicles’ Gravity-Aided Inertial
Navigation System. Navig. J. Inst. Navig. 2017, 64, 111–122. [CrossRef]
57. Pei, Y.; Chen, Z.; Hung, J.C. BITAN-II: An improved terrain aided navigation algorithm. IECON Proc. Ind.
Electron. Conf. 1996, 3, 1675–1680.
58. Han, Y.; Wang, B.; Deng, Z.; Fu, M. A combined matching algorithm for underwater gravity-Aided navigation.
IEEE/ASME Trans. Mechatron. 2018, 23, 233–241. [CrossRef]
59. Guo, C.; Li, A.; Cai, H.; Yang, H. Algorithm for geomagnetic navigation and its validity evaluation. In
Proceedings of the 2011 IEEE International Conference on Computer Science and Automation Engineering,
Shanghai, China, 10–12 June 2011; pp. 573–577.
60. Zhao, J.; Wang, S.; Wang, A. Study on underwater navigation system based on geomagnetic match technique.
In Proceedings of the 2009 9th International Conference on Electronic Measurement & Instruments, Beijing,
China, 16–19 August 2009; pp. 3255–3259.
61. Menq, C.H.; Yau, H.; Lai, G. Automated precision measurement of surface profile in CAD-directed inspection.
IEEE Trans. Robot. Autom. 1992, 8, 268–278. [CrossRef]
62. Wang, S.; Zhang, H.; Kun, Y.; Tian, C. Study on the underwater geomagnetic navigation based on the
integration of TERCOM and K-means clustering algorithm. In Proceedings of the OCEANS’10 IEEE SYDNEY,
Sydney, Australia, 24–27 May 2010; pp. 1–4.
63. Jaulin, L. Isobath following using an altimeter as a unique exteroceptive sensor. CEUR Workshop Proc. 2018,
2331, 105–110.
64. Cowie, M.; Wilkinson, N.; Powlesland, R. Latest Development of the TERPROM ®Digital Terrain System
(DTS). In Proceedings of the 2008 IEEE/ION Position, Location and Navigation Symposium, Monterey, CA,
USA, 5–8 May 2008; pp. 219–1229.
65. Zhao, L.; Gao, N.; Huang, B.; Wang, Q.; Zhou, J. A novel terrain-aided navigation algorithm combined with
the TERCOM algorithm and particle filter. IEEE Sens. J. 2015, 15, 1124–1131. [CrossRef]
66. Salavasidis, G.; Munafò, A.; Harris, C.A.; Prampart, T.; Templeton, R.; Smart, M.; Roper, D.T.; Pebody, M.;
McPhail, S.D.; Rogers, E.; et al. Terrain-aided navigation for long-endurance and deep-rated autonomous
underwater vehicles. J. Field Robot. 2019, 36, 447–474. [CrossRef]
67. Meduna, D.K.; Rock, S.M.; McEwen, R.S. Closed-loop terrain relative navigation for AUVs with non-inertial
grade navigation sensors. In Proceedings of the 2010 IEEE/OES Autonomous Underwater Vehicles, Monterey,
CA, USA, 1–3 September 2010.
68. Nootz, G.; Jarosz, E.; Dalgleish, F.R.; Hou, W. Quantification of optical turbulence in the ocean and its effects
on beam propagation. Appl. Opt. 2016, 55, 8813. [CrossRef]
69. Eren, F.; Pe’Eri, S.; Thein, M.W.; Rzhanov, Y.; Celikkol, B.; Swift, M.R. Position, orientation and velocity
detection of Unmanned Underwater Vehicles (UUVs) using an optical detector array. Sensors 2017, 17, 1741.
[CrossRef]
70. Zhong, L.; Li, D.; Lin, M.; Lin, R.; Yang, C. A fast binocular localisation method for AUV docking. Sensors
2019, 19, 1735. [CrossRef]
71. Liu, S.; Xu, H.; Lin, Y.; Gao, L. Visual navigation for recovering an AUV by another AUV in shallow water.
Sensors 2019, 19, 1889. [CrossRef] [PubMed]
72. Monroy-Anieva, J.A.; Rouviere, C.; Campos-Mercado, E.; Salgado-Jimenez, T.; Garcia-Valdovinos, L.G.
Modeling and control of a micro AUV: Objects follower approach. Sensors 2018, 18, 2574. [CrossRef]
[PubMed]
73. Prats, M.; Ribas, D.; Palomeras, N.; García, J.C.; Nannen, V.; Wirth, S.; Fernández, J.J.; Beltrán, J.P.; Campos, R.;
Ridao, P.; et al. Reconfigurable AUV for intervention missions: A case study on underwater object recovery.
Intell. Serv. Robot. 2012, 5, 19–31. [CrossRef]
74. Durrant-Whyte, H.; Bailey, T. Simultaneous localization and mapping: Part I. IEEE Robot. Autom. Mag. 2006,
13, 99–108. [CrossRef]
75. Lu, Y.; Song, D. Visual Navigation Using Heterogeneous Landmarks and Unsupervised Geometric Constraints.
IEEE Trans. Robot. 2015, 31, 736–749. [CrossRef]
76. Carrillo, H.; Dames, P.; Kumar, V.; Castellanos, J.A. Autonomous robotic exploration using occupancy grid
maps and graph SLAM based on Shannon and Rényi Entropy. In Proceedings of the 2015 IEEE International
Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 487–494.
77. Pizzoli, M.; Forster, C.; Scaramuzza, D. REMODE: Probabilistic, monocular dense reconstruction in real time.
In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong,
China, 31 May–7 June 2014; pp. 2609–2616.
78. Brand, C.; Schuster, M.J.; Hirschmüller, H.; Suppa, M. Stereo-vision based obstacle mapping for
indoor/outdoor SLAM. IEEE Int. Conf. Intell. Robot. Syst. 2014, 1846–1853.
79. Hernández, J.D.; Istenič, K.; Gracias, N.; Palomeras, N.; Campos, R.; Vidal, E.; García, R.; Carreras, M.
Autonomous underwater navigation and optical mapping in unknown natural environments. Sensors 2016,
16, 1174. [CrossRef] [PubMed]
80. Palomer, A.; Ridao, P.; Ribas, D. Multibeam 3D underwater SLAM with probabilistic registration. Sensors
2016, 16, 560. [CrossRef] [PubMed]
81. Roman, C.; Singh, H. Consistency based error evaluation for deep sea bathymetric mapping with robotic
vehicles. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation, Orlando,
FL, USA, 15–19 May 2006; pp. 3568–3574.
82. Gomez-Ojeda, R.; Moreno, F.A.; Zuñiga-Noël, D.; Scaramuzza, D.; Gonzalez-Jimenez, J. PL-SLAM: A Stereo
SLAM System Through the Combination of Points and Line Segments. IEEE Trans. Robot. 2019, 35, 734–746.
[CrossRef]
83. Wang, R.; Wang, X.; Zhu, M.; Lin, Y. Application of a Real-Time Visualization Method of AUVs in Underwater
Visual Localization. Appl. Sci. 2019, 9, 1428. [CrossRef]
84. Ferrera, M.; Moras, J.; Trouvé-Peloux, P.; Creuze, V.; Dégez, D. The Aqualoc Dataset: Towards Real-Time
Underwater Localization from a Visual-Inertial-Pressure Acquisition System. arXiv 2018, arXiv:1809.07076.
85. Joshi, B.; Rahman, S.; Kalaitzakis, M.; Cain, B.; Johnson, J.; Xanthidis, M.; Karapetyan, N.; Hernandez, A.;
Li, A.Q.; Vitzilaios, N.; et al. Experimental Comparison of Open Source Visual-Inertial-Based State Estimation
Algorithms in the Underwater Domain; IEEE/RSJ International Conference on Intelligent Robots and Systems
(IROS): Macau, China, 2019.
86. Allotta, B.; Chisci, L.; Costanzi, R.; Fanelli, F.; Fantacci, C.; Meli, E.; Ridolfi, A.; Caiti, A.; Di Corato, F.;
Fenucci, D. A comparison between EKF-based and UKF-based navigation algorithms for AUVs localization.
In Proceedings of the OCEANS 2015-Genova, Genoa, Italy, 18–21 May 2015; pp. 1–5.
87. Li, W.; Zhang, L.; Sun, F.; Yang, L.; Chen, M.; Li, Y. Alignment calibration of IMU and Doppler sensors for
precision INS/DVL integrated navigation. Optik (Stuttg) 2015, 126, 3872–3876. [CrossRef]
88. Rossi, A.; Pasquali, M.; Pastore, M. Performance analysis of an inertial navigation algorithm with DVL
auto-calibration for underwater vehicle. In Proceedings of the 2014 DGON Inertial Sensors and Systems
(ISS), Karlsruhe, Germany, 16–17 September 2014; pp. 1–19.
89. Gao, W.; Li, J.; Zhou, G.; Li, Q. Adaptive Kalman filtering with recursive noise estimator for integrated
SINS/DVL systems. J. Navig. 2015, 68, 142–161. [CrossRef]
90. Liu, P.; Wang, B.; Deng, Z.; Fu, M. INS/DVL/PS Tightly Coupled Underwater Navigation Method with
Limited DVL Measurements. IEEE Sens. J. 2018, 18, 2994–3002. [CrossRef]
91. Tal, A.; Klein, I.; Katz, R. Inertial navigation system/doppler velocity log (INS/DVL) fusion with partial dvl
measurements. Sensors 2017, 17, 415. [CrossRef]
92. Zhang, T.; Shi, H.; Chen, L.; Li, Y.; Tong, J. AUV positioning method based on tightly coupled SINS/LBL for
underwater acoustic multipath propagation. Sensors 2016, 16, 357. [CrossRef]
93. Zhang, T.; Chen, L.; Li, Y. AUV underwater positioning algorithm based on interactive assistance of SINS
and LBL. Sensors 2016, 16, 42. [CrossRef]
94. Manzanilla, A.; Reyes, S.; Garcia, M.; Mercado, D.; Lozano, R. Autonomous navigation for unmanned
underwater vehicles: Real-time experiments using computer vision. IEEE Robot. Autom. Lett. 2019, 4,
1351–1356. [CrossRef]
95. Li, D.; Ji, D.; Liu, J.; Lin, Y. A Multi-Model EKF Integrated Navigation Algorithm for Deep Water AUV. Int. J.
Adv. Robot. Syst. 2016, 3. [CrossRef]
96. Chen, Y.; Zheng, D.; Miller, P.A.; Farrell, J.A. Underwater Inertial Navigation with Long Baseline Transceivers:
A Near-Real-Time Approach. IEEE Trans. Control Syst. Technol. 2016, 24, 240–251. [CrossRef]
97. Ferrera, M.; Creuze, V.; Moras, J.; Trouvé-Peloux, P. AQUALOC: An underwater dataset for
visual–inertial–pressure localization. Int. J. Rob. Res. 2019, 38, 1549–1559. [CrossRef]
98. Autonomous Field Robotics Laboratory-Datasets. Available online: https://afrl.cse.sc.edu/afrl/resources/
datasets/ (accessed on 30 December 2019).
99. Hwang, J.; Bose, N.; Fan, S. AUV Adaptive Sampling Methods: A Review. Appl. Sci. 2019, 9, 3145. [CrossRef]
100. Farr, N.; Bowen, A.; Ware, J.; Pontbriand, C.; Tivey, M. An integrated, underwater optical/acoustic
communications system. In Proceedings of the OCEANS’10 IEEE SYDNEY, Sydney, Australia,
24–27 May 2010.
101. Stojanovic, M.; Beaujean, P.-P.J. Acoustic Communication. In Springer Handbook of Ocean Engineering;
Dhanak, M.R., Xiros, N.I., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 359–386.
ISBN 978-3-319-16649-0.
102. 920 Series ATM-925-Acoustic Modems-Benthos. Available online: http://www.teledynemarine.com/920-
series-atm-925?ProductLineID=8 (accessed on 26 December 2019).
103. Micromodem: Acoustic Communications Group. Available online: https://acomms.whoi.edu/micro-modem
(accessed on 26 December 2019).
104. LinkQuest. Available online: http://www.link-quest.com/html/uwm1000.htm (accessed on 26 December 2019).
105. 48/78 Devices | EvoLogics. Available online: https://evologics.de/acoustic-modem/48-78 (accessed on 26 December 2019).
106. Mats 3G, Underwater Acoustics-Sercel. Available online: http://www.sercel.com/products/Pages/mats3g.aspx
(accessed on 26 December 2019).
107. L3Harris | Acoustic Modem GPM300. Available online: https://www.l3oceania.com/mission-systems/
undersea-communications/acoustic-modem.aspx (accessed on 26 December 2019).
108. Micron Modem | Tritech | Outstanding Performance in Underwater Technology. Available online: https:
//www.tritech.co.uk/product/micron-data-modem (accessed on 26 December 2019).
109. M64 Acoustic Modem for Wireless Underwater Communication. Available online: https://bluerobotics.com/
store/comm-control-power/acoustic-modems/wl-11003-1/ (accessed on 26 December 2019).
110. Yan, Z.; Wang, L.; Wang, T.; Yang, Z.; Chen, T.; Xu, J. Polar cooperative navigation algorithm for
multi-unmanned underwater vehicles considering communication delays. Sensors 2018, 18, 1044. [CrossRef]
[PubMed]
111. Giodini, S.; Binnerts, B. Performance of acoustic communications for AUVs operating in the North Sea. In
Proceedings of the OCEANS 2016 MTS/IEEE Monterey, Monterey, CA, USA, 19–23 September 2016; pp. 1–6.
112. Yang, T.; Yu, S.; Yan, Y. Formation control of multiple underwater vehicles subject to communication faults
and uncertainties. Appl. Ocean Res. 2019, 82, 109–116. [CrossRef]
113. Abad, A.; DiLeo, N.; Fregene, K. Ieee Decentralized Model Predictive Control for UUV Collaborative
Missions. In Proceedings of the OCEANS 2017-Anchorage, Anchorage, AK, USA, 18–21 September 2017.
114. Hallin, N.J.; Horn, J.; Taheri, H.; O’Rourke, M.; Edwards, D. Message anticipation applied to collaborating
unmanned underwater vehicles. In Proceedings of the OCEANS’11 MTS/IEEE KONA, Waikoloa, HI, USA,
19–22 September 2017; pp. 1–10.
115. Beidler, G.; Bean, T.; Merrill, K.; O’Rourke, M.; Edwards, D. From language to code: Implementing AUVish.
In Proceedings of the UUST07, Kos, Greece, 7–9 May 2007; pp. 19–22.
116. Rajala, A.; O’Rourke, M.; Edwards, D.B. AUVish: An application-based language for cooperating AUVs. In
Proceedings of the OCEANS 2006, Boston, MA, USA, 18–21 September 2006.
117. Potter, J.; Alves, J.; Green, D.; Zappa, G.; Nissen, I.; McCoy, K. The JANUS underwater communications
standard. Underw. Commun. Networking, UComms 2014, 1, 1–4.
118. Petroccia, R.; Alves, J.; Zappa, G. Fostering the use of JANUS in operationally-relevant underwater
applications. In Proceedings of the 2016 IEEE Third Underwater Communications and Networking
Conference (UComms), Lerici, Italy, 30 August–1 September 2016; pp. 1–5.
119. Alves, J.; Furfaro, T.; Lepage, K.; Munafo, A.; Pelekanakis, K.; Petroccia, R.; Zappa, G. Moving JANUS
forward: A look into the future of underwater communications interoperability. In Proceedings of the
OCEANS 2016 MTS/IEEE Monterey, Monterey, CA, USA, 19–23 September 2016; pp. 1–6.
120. Petroccia, R.; Alves, J.; Zappa, G. JANUS-based services for operationally relevant underwater applications.
IEEE J. Ocean. Eng. 2017, 42, 994–1006. [CrossRef]
121. McCoy, K.; Djapic, V.; Ouimet, M. JANUS: Lingua Franca. In Proceedings of the OCEANS 2016 MTS/IEEE
Monterey, Monterey, CA, USA, 19–23 September 2016; pp. 1–4.
122. Wiener, T.F.; Karp, S. The Role of Blue/Green Laser Systems in Strategic Submarine Communications. IEEE
Trans. Commun. 1980, 28, 1602–1607. [CrossRef]
123. Puschell, J.J.; Giannaris, R.J.; Stotts, L. The Autonomous Data Optical Relay Experiment: First two way laser
communication between an aircraft and submarine. In Proceedings of the National Telesystems Conference,
NTC IEEE 1992, Washington, DC, USA, 19–20 May 1992.
124. Enqi, Z.; Hongyuan, W. Research on spatial spreading effect of blue-green laser propagation through seawater
and atmosphere. In Proceedings of the 2009 International Conference on E-Business and Information System
Security, Wuhan, China, 23–24 May 2009; pp. 1–4.
125. Sangeetha, R.S.; Awasthi, R.L.; Santhanakrishnan, T. Design and analysis of a laser communication link
between an underwater body and an air platform. In Proceedings of the 2016 International Conference on
Next Generation Intelligent Systems (ICNGIS), Kottayam, India, 1–3 September 2017; pp. 1–5.
126. Cossu, G.; Corsini, R.; Khalid, A.M.; Balestrino, S.; Coppelli, A.; Caiti, A.; Ciaramella, E. Experimental
demonstration of high speed underwater visible light communications. In Proceedings of the 2013
2nd International Workshop on Optical Wireless Communications (IWOW), Newcastle upon Tyne, UK,
21–21 October 2013; pp. 11–15.
127. Luqi, L. Utilization and risk of undersea communications. In Proceedings of the Ocean MTS/IEEE Monterey,
Monterey, CA, USA, 19–23 September 2016; pp. 1–7.
128. Yan, Z.; Yang, Z.; Yue, L.; Wang, L.; Jia, H.; Zhou, J. Discrete-time coordinated control of leader-following
multiple AUVs under switching topologies and communication delays. Ocean Eng. 2019, 172, 361–372.
[CrossRef]
129. Lichuan, Z.; Jingxiang, F.; Tonghao, W.; Jian, G.; Ru, Z. A new algorithm for collaborative navigation without
time synchronization of multi-UUVS. Ocean. Aberdeen 2017, 2017, 1–6.
130. Yan, Z.; Xu, D.; Chen, T.; Zhang, W.; Liu, Y. Leader-follower formation control of UUVs with model
uncertainties, current disturbances, and unstable communication. Sensors 2018, 18, 662. [CrossRef]
131. Cui, J.; Zhao, L.; Ma, Y.; Yu, J. Adaptive consensus tracking control for multiple autonomous underwater
vehicles with uncertain parameters. ICIC Express Lett. 2019, 13, 191–200.
132. Teck, T.Y.; Chitre, M.; Hover, F.S. Collaborative bathymetry-based localization of a team of autonomous
underwater vehicles. Proc. IEEE Int. Conf. Robot. Autom. 2014, 2475–2481.
133. Tan, Y.T.; Gao, R.; Chitre, M. Cooperative path planning for range-only localization using a single moving
beacon. IEEE J. Ocean. Eng. 2014, 39, 371–385. [CrossRef]
134. De Palma, D.; Indiveri, G.; Parlangeli, G. Multi-vehicle relative localization based on single range
measurements. IFAC-PapersOnLine 2015, 28, 17–22. [CrossRef]
135. Baylog, J.G.; Wettergren, T.A. A ROC-Based approach for developing optimal strategies in UUV search
planning. IEEE J. Ocean. Eng. 2018, 43, 843–855. [CrossRef]
136. Li, J.; Zhang, J.; Zhang, G.; Zhang, B. An adaptive prediction target search algorithm for multi-AUVs in an
unknown 3D environment. Sensors 2018, 18, 3853. [CrossRef] [PubMed]
137. Lv, S.; Zhu, Y. A Multi-AUV Searching Algorithm Based on Neuron Network with Obstacle. Int. Symp.
Auton. Syst. 2019, 131–136.
138. Bing Sun, D.Z. Complete Coverage Autonomous Underwater Vehicles Path Planning Based on Glasius
Bio-Inspired Neural Network Algorithm for Discrete and Centralized Programming. IEEE Trans. Cogn. Dev.
Syst. 2019, 73–84.
139. Yan, Z.; Liu, X.; Zhou, J.; Wu, D. Coordinated Target Tracking Strategy for Multiple Unmanned Underwater
Vehicles with Time Delays. IEEE Access 2018, 6, 10348–10357. [CrossRef]
140. Palomeras, N.; Peñalver, A.; Massot-Campos, M.; Negre, P.L.; Fernández, J.J.; Ridao, P.; Sanz, P.J.;
Oliver-Codina, G. I-AUV docking and panel intervention at sea. Sensors 2016, 16, 1673. [CrossRef]
141. Ribas, D.; Ridao, P.; Turetta, A.; Melchiorri, C.; Palli, G.; Fernandez, J.J.; Sanz, P.J. I-AUV Mechatronics
Integration for the TRIDENT FP7 Project. IEEE/ASME Trans. Mechatron. 2015, 20, 2583–2592. [CrossRef]
142. Casalino, G.; Caccia, M.; Caselli, S.; Melchiorri, C.; Antonelli, G.; Caiti, A.; Indiveri, G.; Cannata, G.; Simetti, E.;
Torelli, S.; et al. Underwater intervention robotics: An outline of the Italian national project Maris. Mar.
Technol. Soc. J. 2016, 50, 98–107. [CrossRef]
143. Simetti, E.; Wanderlingh, F.; Torelli, S.; Bibuli, M.; Odetti, A.; Bruzzone, G.; Rizzini, D.L.; Aleotti, J.; Palli, G.;
Moriello, L.; et al. Autonomous Underwater Intervention: Experimental Results of the MARIS Project. IEEE
J. Ocean. Eng. 2018, 43, 620–639. [CrossRef]
144. Simetti, E.; Casalino, G.; Manerikar, N.; Sperinde, A.; Torelli, S.; Wanderlingh, F. Cooperation between
autonomous underwater vehicle manipulations systems with minimal information exchange. In Proceedings
of the OCEANS 2015-Genova, Genoa, Italy, 18–21 May 2015.
145. Simetti, E.; Casalino, G. Manipulation and Transportation with Cooperative Underwater Vehicle Manipulator
Systems. IEEE J. Ocean. Eng. 2017, 42, 782–799. [CrossRef]
146. Conti, R.; Meli, E.; Ridolfi, A.; Allotta, B. An innovative decentralized strategy for I-AUVs cooperative
manipulation tasks. Rob. Auton. Syst. 2015, 72, 261–276. [CrossRef]
147. Cataldi, E.; Chiaverini, S.; Antonelli, G. Cooperative Object Transportation by Two Underwater
Vehicle-Manipulator Systems. In Proceedings of the 2018 26th Mediterranean Conference on Control
and Automation (MED), Zadar, Croatia, 19–22 June 2018; pp. 161–166.
148. Heshmati-alamdari, S.; Karras, G.C.; Kyriakopoulos, K.J. A Distributed Predictive Control Approach for
Cooperative Manipulation of Multiple Underwater Vehicle Manipulator Systems. In Proceedings of the
2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019;
pp. 4626–4632.
149. Prats, M.; Perez, J.; Fernandez, J.J.; Sanz, P.J. An open source tool for simulation and supervision of underwater
intervention missions. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots
and Systems, Vilamoura, Portugal, 7–12 October 2012; pp. 2577–2582.
150. Hernández-Alvarado, R.; García-Valdovinos, L.G.; Salgado-Jiménez, T.; Gómez-Espinosa, A.;
Fonseca-Navarro, F. Neural network-based self-tuning PID control for underwater vehicles. Sensors
2016, 16, 1429. [CrossRef] [PubMed]
151. García-Valdovinos, L.G.; Fonseca-Navarro, F.; Aizpuru-Zinkunegi, J.; Salgado-Jiménez, T.;
Gómez-Espinosa, A.; Cruz-Ledesma, J.A. Neuro-Sliding Control for Underwater ROV’s Subject to Unknown
Disturbances. Sensors 2019, 19, 2943. [CrossRef] [PubMed]

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
