
GPS-Denied Navigation

Advances in GPS, GLONASS, and Galileo now provide UAVs with highly accurate absolute position information. Although these solutions are readily available and accurate, relying on them can be problematic: the signals can be spoofed or jammed, and navigating through areas with an obstructed view of the sky, or through indoor environments, leads to GPS signal loss. Prior to the introduction of the Global Positioning System, all navigation systems were GPS denied. Such systems use a variety of sensors to estimate motion. Proprioceptive sensors, such as gyroscopes and IMUs, estimate the rotation and acceleration of the vehicle relative to itself, but they are subject to significant drift. Most GPS-denied navigation solutions therefore use exteroceptive sensors to mitigate the drift from proprioceptive sensors.
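To make that drift concrete, here is a minimal sketch (Python/NumPy; the bias and noise magnitudes are illustrative assumptions, not values from any particular IMU) that integrates a biased, noisy accelerometer twice and shows the position error growing quadratically with time:

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01                 # assumed 100 Hz IMU
steps = 10_000            # 100 s of dead reckoning
bias = 0.02               # assumed constant accelerometer bias (m/s^2)
noise_std = 0.05          # assumed white accelerometer noise (m/s^2)

true_accel = np.zeros(steps)                                  # vehicle at rest
meas_accel = true_accel + bias + rng.normal(0.0, noise_std, steps)

vel = np.cumsum(meas_accel) * dt    # first integration: velocity
pos = np.cumsum(vel) * dt           # second integration: position

# The bias alone contributes 0.5 * bias * t^2 ~ 100 m of error after 100 s,
# which is why an exteroceptive correction is needed.
print(f"position error after {steps * dt:.0f} s: {pos[-1]:.1f} m")
```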
Exteroceptive sensors, such as radio beacons [33-37], sonar [7,38-42], cameras [22-28], radar [13,15,19,43,44], and laser range finders [29-32], sense distinctive features in the surrounding environment. These sensors estimate relative measurements such as heading angle and range, and motion is estimated as the measurements change over time. Other exteroceptive sensors, such as pitot tubes and magnetometers, measure properties of the environment itself. While each sensor has its own strengths and limitations, each provides value when compared with pure dead-reckoning approaches. The figure below shows the navigation architecture for a GPS-denied environment.

Radar beacons
A number of navigation systems use radar to communicate with beacons that serve as landmarks distributed throughout the navigation environment [21,59-62]. The radio signals received from the beacons are decoded and converted into a range measurement, much as is done on a global scale by GPS. These systems are usually very accurate but have low update rates. Most of the research focuses on range-only models [37,63,64], which are very similar to those used in single-antenna radar systems. Unlike non-beacon radar systems, the beacon inherently provides a feature description, which is unavailable in other radar systems. However, their dependency on an active, distributed beacon infrastructure limits their scope for widespread use as a GPS-denied navigation solution.
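As a sketch of how range-only beacon measurements become a position fix, the snippet below runs a standard Gauss-Newton least-squares solve against three beacons at known positions (the beacon layout and the noise-free ranges are illustrative assumptions; a real system would also weight noisy measurements):

```python
import numpy as np

beacons = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])  # known landmarks
true_pos = np.array([40.0, 25.0])
ranges = np.linalg.norm(beacons - true_pos, axis=1)           # noise-free for clarity

p = np.array([10.0, 10.0])          # initial position guess
for _ in range(10):                 # Gauss-Newton iterations
    diffs = p - beacons             # vectors from each beacon to the estimate
    dists = np.linalg.norm(diffs, axis=1)
    residuals = dists - ranges      # predicted minus measured ranges
    J = diffs / dists[:, None]      # Jacobian of range w.r.t. position
    p -= np.linalg.lstsq(J, residuals, rcond=None)[0]

print(p)   # converges to (40, 25)
```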
SONAR
SOund Navigation And Ranging (sonar) is a technique that uses sound propagation to detect range: it transmits sound waves and receives the echoes reflected back from environmental features. Modern sonar work centers on the navigation of unmanned underwater vehicles (UUVs), though its application to airborne sensing has also been explored [65].
Many sonar navigation techniques have been devised. The most common are similar to the radar beacon approach, but use sonar transceivers. Synthetic aperture sonar (SAS) approaches are used for terrain-based navigation [66], and another approach performs sonar feature identification and tracking [40].
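The underlying ranging computation is the same for any active sonar: range is half the round-trip echo time multiplied by the speed of sound in the medium. A minimal sketch (the echo time is an illustrative round number):

```python
# Speed of sound is roughly 1500 m/s in seawater and 343 m/s in air.
def sonar_range(echo_time_s: float, sound_speed_ms: float = 1500.0) -> float:
    """Range to a reflecting feature from the round-trip echo time."""
    return sound_speed_ms * echo_time_s / 2.0

print(sonar_range(0.040))         # UUV in water: 40 ms round trip -> 30.0 m
print(sonar_range(0.040, 343.0))  # same echo time in air -> ~6.9 m
```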

Light Detection and Ranging Devices (LiDAR)


A Light Detection and Ranging (LiDAR) device, or laser range finder, estimates range from the reflection of a transmitted optical, laser, or infrared beam off the environment. Range finders are usually used in combination with visual odometry, and are now commonly utilized in GPS-denied applications.
The use of LiDAR on UAVs is explored in the literature [32,57,58], where it is commonly applied to terrain-referenced navigation: the vehicle measures the terrain beneath it and compares the measurement to a previously acquired, accurate digital elevation model. LiDAR range is limited, although many units operate at ranges beyond 100 m. Unlike cameras, LiDAR can operate at night, but it is affected by poor weather.
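A toy sketch of the terrain-referenced idea: slide the LiDAR-measured elevation profile along a stored digital elevation model (DEM) and pick the offset with the smallest mismatch. The 1-D synthetic DEM, profile length, and noise level are all assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
dem = np.cumsum(rng.normal(0.0, 1.0, 500))   # synthetic 1-D elevation model (m)

true_offset = 200                            # vehicle's actual along-track position
window = 50                                  # length of the measured strip
profile = dem[true_offset:true_offset + window] + rng.normal(0.0, 0.1, window)

# Compare the measured profile against every candidate DEM position.
errors = [np.mean((dem[i:i + window] - profile) ** 2)
          for i in range(len(dem) - window)]
print(int(np.argmin(errors)))                # recovers 200
```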
Vision-Based Navigation
Vision-based navigation systems use images taken by a digital camera to determine the attitude and position of the UAV. Extracting attitude and position information from camera imagery is computationally intensive, so the onboard computers of a UAV cannot generate a vision-based solution at a high rate. The navigation system therefore cannot rely solely on vision; instead, the vision-based system is coupled with a dead-reckoning system or INS. Together, the two systems provide a high-rate attitude and position solution: the vision-based solution supplies continuous, regular updates that mitigate the drift of the dead-reckoning system or INS.
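A 1-D sketch of that coupling, with a high-rate dead-reckoning propagation whose drift is periodically pulled back by a low-rate vision fix (the rates, drift, noise level, and blending gain are illustrative assumptions; a real system would fuse the two with a Kalman filter):

```python
import numpy as np

rng = np.random.default_rng(2)
dt = 0.01                # 100 Hz dead-reckoning propagation (assumed)
vision_period = 100      # one vision fix per second (assumed)

vel = 5.0                # true forward speed (m/s), constant for simplicity
vel_error = 0.1          # dead-reckoning speed error that causes drift (assumed)
true_pos, pos_est = 0.0, 0.0

for k in range(1000):
    true_pos += vel * dt
    pos_est += (vel + vel_error) * dt        # high-rate propagation drifts
    if k % vision_period == 0:
        # Low-rate vision fix: blend the estimate toward the measurement.
        vision_fix = true_pos + rng.normal(0.0, 0.05)
        pos_est += 0.8 * (vision_fix - pos_est)

print(f"final error: {abs(pos_est - true_pos):.2f} m (bounded, not growing)")
```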
Optical Sensors and Visual Odometry
Optical sensors use one or more cameras to estimate the relative orientation of features: locations with a distinct change in lighting from high to low. A feature is represented in the image by a pixel, which is converted to an elevation angle and an azimuth angle by relating the pixel position to the focal length of the camera. In stereo vision, a second camera is used to estimate the feature range, but the distance between the cameras is required to be smaller than the range to the feature [45,46]. Range can also be estimated by a technique called bundle adjustment, which uses a sequence of camera views. In both cases, the estimate becomes increasingly erroneous as the distance to the feature increases, due to the camera's fixed resolution.
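A sketch of that pixel-to-angle conversion, assuming a simple pinhole camera model with the focal length expressed in pixels (the principal point and focal length values are made up):

```python
import numpy as np

def pixel_to_angles(u, v, cx, cy, f):
    """Convert a pixel (u, v) to azimuth/elevation with the pinhole model.

    (cx, cy) is the principal point and f the focal length in pixels.
    """
    azimuth = np.arctan2(u - cx, f)     # horizontal angle from the optical axis
    elevation = np.arctan2(cy - v, f)   # vertical angle (image rows grow downward)
    return azimuth, elevation

az, el = pixel_to_angles(800, 300, cx=640, cy=360, f=700.0)
print(np.degrees(az), np.degrees(el))
```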
For more than a century, resolving relative motion from two consecutive images has been a topic of wide research, commonly referred to as structure from motion (SFM). Most SFM research focuses on reconstructing a 3D environment from multiple images. A related area of study is visual odometry (VO), which estimates motion from the change in perspective between consecutive images [22-24]. Motion is estimated by identifying corresponding landmarks and features in each image and measuring the resulting change in their orientation. The camera configuration greatly influences the specific motion-estimation technique.
Visual odometry approaches are readily available and cheap, and have been shown to provide successful navigation solutions. One of the main limitations of these techniques is that each pixel covers an increasingly large area as the range to the feature increases, hindering the ability to detect and track objects. They also tend to perform poorly in bad lighting or poor weather.
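A minimal two-frame visual-odometry sketch using OpenCV's ORB features and essential-matrix recovery, assuming two consecutive grayscale frames and a known camera matrix K; note that monocular VO returns translation only up to scale:

```python
import cv2
import numpy as np

def relative_motion(img1, img2, K):
    """Estimate relative rotation R and unit-scale translation t between frames."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t   # t has unit norm: monocular VO cannot observe absolute scale

# Hypothetical usage with two consecutive grayscale frames and intrinsics K:
# R, t = relative_motion(frame_prev, frame_curr, K)
```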
Mapping
A variety of GPS-denied algorithms improve the consistency of an odometric navigation solution by creating a catalog of observed or known features, commonly referred to as a map. The map holds each feature estimate even after the feature leaves the sensor's field of view, in the hope that it will be re-observed. When a previously observed feature is seen again, the mapping algorithm performs an additional estimation step over the entire state model.
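A sketch of the bookkeeping a map provides: features stay in the catalog after leaving the field of view, and a re-observation flags the correction step over the state model (the data structure and names are illustrative assumptions):

```python
class FeatureMap:
    """Catalog of landmark estimates, kept even when out of the field of view."""

    def __init__(self):
        self.features = {}                 # feature id -> estimated position

    def observe(self, feature_id, position_estimate):
        if feature_id in self.features:
            # Re-observation: in a full system this triggers an estimation
            # step over the entire state model (e.g. an EKF correction).
            print(f"re-observed {feature_id}: correction step")
        self.features[feature_id] = position_estimate

m = FeatureMap()
m.observe("corner_7", (4.1, 2.0))
m.observe("corner_7", (4.0, 2.1))   # seen again after leaving the field of view
```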
SLAM
The framework for SLAM was developed by Smith, Cheeseman, and Durrant-Whyte. SLAM is founded on using observed landmarks to estimate geometric uncertainty. Later work developed a technique to encode the spatial relationships of the known landmarks in a stochastic map [100]: a state-space model denoting the landmark positions and the vehicle pose. Csorba showed that linking localization and mapping into a single estimation problem resulted in a convergent solution, strengthening the argument for SLAM. A convergent probabilistic approach was developed by Thrun [65]. In both approaches, convergence depends on loop closure.
Much of the original work on SLAM was sensor-agnostic and model-based. Thrun's model assumed a sensor able to measure the type of landmark along with the approximate distance and relative angle from the vehicle to the landmark [65]. Leonard and Durrant-Whyte tested their extended Kalman filter localization technique using sonar, and Csorba used laser range finders.
Multiple approaches to SLAM exist. EKF-SLAM, the most popular implementation, models noise as Gaussian and uses an extended Kalman filter (EKF) to estimate the map and vehicle state. FastSLAM uses a Rao-Blackwellised particle filter, which more readily handles nonlinear process models [102]. GraphSLAM is similar to EKF-SLAM, but composes the state matrix in an information-state form that better handles the large state models present in large maps [103].
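A heavily simplified EKF-SLAM sketch in one dimension, jointly estimating one robot coordinate and one static landmark (the measurement model here is linear, so the "extended" part is trivial; all noise values are illustrative assumptions). It shows the two ingredients named above: Gaussian noise and a joint state covariance updated by predict and correct steps:

```python
import numpy as np

rng = np.random.default_rng(3)

# Joint state [robot position, landmark position] and its covariance.
x = np.array([0.0, 0.0])            # landmark initially unknown...
P = np.diag([0.01, 100.0])          # ...so its variance starts large

Q, R = 0.02, 0.05                   # process / measurement noise variances (assumed)
true_robot, true_landmark = 0.0, 10.0

for _ in range(50):
    u = 0.1                                          # commanded forward motion
    true_robot += u + rng.normal(0.0, np.sqrt(Q))

    # Predict: only the robot moves; the landmark is static.
    x = x + np.array([u, 0.0])
    P = P + np.diag([Q, 0.0])

    # Update with a relative range measurement z = landmark - robot + noise.
    z = (true_landmark - true_robot) + rng.normal(0.0, np.sqrt(R))
    H = np.array([[-1.0, 1.0]])                      # measurement Jacobian
    S = H @ P @ H.T + R
    K = P @ H.T / S                                  # Kalman gain (2x1)
    x = x + (K * (z - (x[1] - x[0]))).ravel()
    P = (np.eye(2) - K @ H) @ P

print(x)   # robot and landmark estimates converge jointly
```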

Although SLAM is considered a successful navigation solution, the algorithm is often overly confident in its estimate, frequently trusting inaccurate estimates. The state model also grows significantly as the number of features increases. Much of current SLAM research therefore focuses on simplifying, or pruning, the feature space; see [6,7] for a fuller treatment.
View-Based Maps
One notable SLAM approach is view-based mapping, developed specifically for use with optical sensors [105,106]. The view-based map algorithm is based on VO and reuses many of the techniques developed for FrameSLAM [26]. It uses a bag-of-words approach to describe visual features and adds a vocabulary tree for speed. Additionally, it implements a trimmed constraint graph, or skeleton graph, over the consecutive image frames. The skeleton graph improves performance over long maneuvers by keeping fewer nodes in the graph than a typical SLAM graph.
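A toy sketch of the bag-of-words idea behind view-based mapping: descriptors are quantized against a visual vocabulary, each view becomes a normalized word histogram, and candidate loop closures are scored by histogram similarity (the random vocabulary and descriptors are stand-ins; real systems use a vocabulary tree for speed, as noted above):

```python
import numpy as np

rng = np.random.default_rng(4)
vocab = rng.normal(size=(50, 32))          # 50 visual words, 32-D descriptors

def bow_histogram(descriptors):
    """Quantize each descriptor to its nearest visual word and count."""
    dists = np.linalg.norm(descriptors[:, None, :] - vocab[None, :, :], axis=2)
    words = np.argmin(dists, axis=1)
    hist = np.bincount(words, minlength=len(vocab)).astype(float)
    return hist / hist.sum()

view_a = rng.normal(size=(200, 32))
view_b = view_a + rng.normal(0.0, 0.05, view_a.shape)   # revisit of the same place

h_a, h_b = bow_histogram(view_a), bow_histogram(view_b)
similarity = h_a @ h_b / (np.linalg.norm(h_a) * np.linalg.norm(h_b))
print(similarity)    # near 1.0 for matching views
```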
