Global Positioning System (GPS) Based Location Finder On Android
Emphasis on Telecommunications
May 2015
_________________________________________________
Blekinge Institute of Technology
Department of Applied Signal Processing
Supervisor: Muhammad Shahid
Examiner: Sven Johansson
Abstract
This thesis presents the development of an Android application that uses the concepts of
augmented reality to superimpose virtual information about the user's surroundings by
detecting and tracking the user's location in real time.
Eclipse is open-source software used in the professional development of software solutions and
programming applications, and it provides extensive free libraries. It has been employed
for the development of the software used in this thesis.
Using the Android GPS, the application is fully location aware and keeps track of the
user's location. When the user points the camera in a specific direction, the application
tracks the camera orientation and displays records for places in that direction, updating
the information as the direction changes.
The additional information is retrieved with the help of Google databases. The gathered
information is then overlaid on the live camera feed, which helps users interact in a
more natural way. An option for viewing the places in map view, with the help of Google
Maps, is also available.
I am very thankful to my parents, family members and friends for their continuous support and
help.
TABLE OF CONTENTS
Chapter-2 Background ......................................................................................................................4
3.2 Background ................................................................................................................................6
5.5 Environment .............................................................................................................................18
5.6 Interface ...................................................................................................................................18
5.8 APIs .........................................................................................................................................18
Chapter-6 Methodology ..................................................................................................................19
Chapter-7 Implementation ..............................................................................................................20
Chapter-9 References .....................................................................................................................29
Chapter 1
Introduction
"Global Positioning System Based Location Finder on Android" is a smartphone application
that uses location-based information and the concepts of augmented reality to enhance the
user's experience. Using the Android GPS, the application is location aware: it keeps track
of the user's location in real time. Whenever the user points the phone in any direction, the
application updates its camera view by displaying label tags with additional information
about the buildings and places in that direction. The application keeps updating the view as
the user changes location or direction. Information about different places is extracted from
the Google database. All the information about the user's location is augmented onto the
user's reality, in this case the live camera feed.
Schools, Colleges & Universities
Airports
Railway Stations
Police Stations
Petrol Pumps
Entertainment areas
- Operating System
The platform used for the development of this project was the Windows 7
operating system.
- Programming Language
Java was used as the development language.
- Environment
Eclipse (Helios)
- GUI
The GUI was designed in Eclipse using XML.
- Other Tools
Java JDK (Java Development Kit)
Android SDK (Software Development Kit)
- APIs
The Google Maps API has also been used in the development of this application.
- Android Version
The minimum Android version required by our application is 2.3.4.
1.4 Features and functionality of the proposed system
The proposed application requires an Android smartphone running version 2.3.4 or
above. Whenever the user points the phone in any direction, the application updates its
camera view by displaying label tags with additional information about the buildings and
places in that direction. The basic functionality of this application is to fetch the location
from GPS and send the current location to a database. It then receives information about
surrounding places from the database, and uses the device compass and accelerometer to
determine the device orientation. This information is then fed into an algorithm that maps
the coordinates onto the screen.
Chapter 2
Background
2.1 Android
Operating systems have advanced a great deal over the last few decades. In the early
stages of cell phones, black-and-white handsets were the focus, but with the passage of time
cell phones were pushed to the next level, and now smartphones are particularly in focus:
the mobile OS has come onto the horizon. Smartphones and tablets needed a new kind of
operating system, so after the year 2000 operating systems like Android and BlackBerry
were developed. With the passage of time, Android became one of the most widely used
mobile operating systems. Android Inc. was founded in Palo Alto, California, U.S. by Andy
Rubin, Rich Miner and Chris White in 2003, and in 2005 Android was acquired by Google.
With the advancement of the Android OS, numerous versions were introduced, followed by
a number of updates to the original version of Android [1]. The Android operating system
gives flexibility to both users and developers.
Android is not constrained to any single manufacturer, so it allows developers to modify
and extend the original source, which makes the development process more innovative.
Rapid improvement
As Android developers, we have the opportunity to take advancement to the next
level. Advancement can come in any form, such as the OLED displays in Android cell
phones.
Mobile applications can be developed in several languages, such as Java.
License Free
License expenses matter a great deal, but they are not an issue in Android
development because of its open-source licensing.
Ease in Application Development
The emulator, provided with the Android SDK (Software Development Kit), plays an
important role in the testing process of any application made for Android.
Popularity
With continued advancement, Android has become one of the most popular mobile
operating systems.
When talking about iOS, developers have to deal with fewer complications in the
development process because of the smaller number of versions. The same is not the case
with Android applications: there are different brands on the market offering different
screen sizes and using various kinds of processors. This gives Android developers a great
opportunity to innovate, though it can also make development a difficult task.
Performance Consideration
Every device behaves in its own way. Different brands offering different devices,
with various screen sizes and various kinds of processors, can result in
performance issues.
Android is largely maintained by Google, which keeps the platform relatively
vendor-independent and makes it much better.
Chapter 3
Augmented Reality
AR (Augmented Reality) is one of the most interesting topics in the virtual reality area. There
is a big difference between the real world and virtual reality, and when these two realities
merge at a point, the result is a mixed reality. The information around us can offer a great
deal to users. Augmented reality is a grouping of the real scene viewed by the user and a
virtual scene created by the computer, which augments the scene with extra information;
the extra information may include tags, pictures, historical events, etc.
Augmented reality enhances the perception of the user by presenting nearby objects more
clearly. The main objective of augmented reality is that there should be no perceptible
difference between the real and virtual environment [3]: users should feel that they are
looking at an actual real scene or environment. To achieve this, the virtual images are
merged with the real view to create the augmented display.
An improper implementation leaves the user in the dark, without the required
information. The system-generated virtual objects, in the form of graphics, must be correctly
registered with the real world in all dimensions, and this registration must be maintained
while the user moves or changes direction from place to place.
The real and the virtual environment are two different ends of a continuum. When these
two ends mix at a point, the result is a mixed reality. Augmented reality stays close to the
reality end of this continuum, with the predominant perception being the real world,
augmented by computer-generated data.
3.2 Background
Augmented reality originated and has progressed alongside virtual reality since the
beginning of the 1950s, but it saw its main advancements in the last decade of the 20th century [4].
This technology has been used for many years in CAD software for aircraft assembly
design, in simulation, in navigation and in the military. Complex tasks such as installation
and maintenance can be simplified, product prototypes can be evaluated, and training can
be held before anything is manufactured.
Augmented reality technology has also proved very useful on a daily basis, especially in
marketing: compared with ads placed in traditional 2-D media, AR content is not only more
appealing but also interactive, and given its initial novelty it has high viral potential.
Consumers love clever marketing and respond positively to products that will be
remembered.
The reach of an AR application depends on its audience: it is limited to people who own
smartphones and are interested in downloading the software.
What is certain is that the smartphone population is growing, and so is the available level of
processing power. More and more customers have phones capable of displaying augmented
reality, and once the software is downloaded, their first use of it is typically driven by
curiosity.
As the content becomes more attractive and creative, users actually get a new and
fun twist on the usual marketing and services through augmented reality.
Furthermore, AR can also be used with printed markers held in front of a webcam
connected to a computer. The branded image is shown on screen over the webcam feed, so
that consumers can place the marker in places like the forehead (as a mask), or move the
marker to control a character in a game.
3.4 How does it work?
With a mobile app, the cell phone camera identifies and interprets a marker, usually a
black-and-white barcode image. The software analyzes the marker and creates a virtual
image overlaid on the phone screen, anchored to the marker's position in the camera view.
This means that the application has to interpret the camera angle and the distance between
the phone and the marker.
Because of the large number of calculations needed to render an image or model over the
marker, often only smartphones are able to support augmented reality successfully. Phones
need a camera and, when the AR data is not stored in the app, a good 3G internet
connection [5].
Projection
This is the type of AR most commonly seen in sports broadcasts, where the line of
scrimmage or the total yardage needed is shown as a virtual view over the real field.
It is a type of augmented reality that uses virtual imagery to augment what you see live.
Marker Based
This type of AR recognizes shapes, faces or other objects in the real environment
to provide users with a greater amount of information. Some smartphones have the ability
to recognize and read the barcode of a product, and from that barcode additional
information, like reviews and the price of the product, can be shown.
Location Based
This variety uses GPS for location information. It gathers the places in close proximity
and returns additional information to the user. For example, a user with a GPS-equipped
smartphone can determine their location and then have on-screen indications augmented
over a live image to guide them towards their destination.
Outline
Outline is a type of AR that combines the outline of one's body, or a part of the body, with
virtual objects. This allows users to mix virtual objects into the real environment,
picking up or manipulating objects that do not exist in reality.
3.6 Applications of Augmented Reality
Applications for augmented reality are wide-ranging. They are found in different areas such
as:
Direction
Direction finding applications, also known as navigation, are a natural fit for
augmented reality. Enhanced systems based on GPS are being used to make it easier for
the user to get from one point to another.
Sightseeing
There are various applications for augmented reality in sightseeing and the
tourism industry. Sightseeing has been made more enthralling with the use of
augmented reality: with a smartphone equipped with a camera, vacationers can walk
through historic sites and see facts and figures overlaid on their live camera screen.
Military
Medical
Gaming
Entertainment
Marker-based augmented reality is a great source of entertainment nowadays.
This category of augmented reality allows users to interact with certain objects
on the screen in 3D. Other applications include maintenance and repair,
advertising and promotion, etc.
Camera Data
Displaying the live feed from the Android camera is the "reality" in augmented
reality. The camera data is available through the APIs in the
android.hardware.Camera package.
If your application doesn't need to analyze frame data, then starting a preview in the
normal way, by using a SurfaceHolder object with the setPreviewDisplay() method,
is appropriate. With this method, you'll be able to display what the camera is recording
on the screen. However, if your application does need the frame data, it's
available by calling the setPreviewCallback() method with a valid
Camera.PreviewCallback object.
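The two preview options above can be sketched as follows. This is a simplified illustration using the android.hardware.Camera API this thesis targets; the Activity/SurfaceView wiring and error handling are omitted, and the class name is ours.

```java
import android.hardware.Camera;
import android.view.SurfaceHolder;

public class PreviewSetup {
    private Camera camera;

    // Option 1: just show the live camera feed on the given Surface.
    public void startSimplePreview(SurfaceHolder holder) throws java.io.IOException {
        camera = Camera.open();
        camera.setPreviewDisplay(holder); // frames are drawn onto the Surface
        camera.startPreview();
    }

    // Option 2: also receive each frame for analysis.
    public void startPreviewWithFrames(SurfaceHolder holder) throws java.io.IOException {
        camera = Camera.open();
        camera.setPreviewDisplay(holder);
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera cam) {
                // 'data' holds the raw frame (NV21 by default); analyze it here.
            }
        });
        camera.startPreview();
    }

    // Always release the camera when done, or other apps cannot use it.
    public void stop() {
        if (camera != null) {
            camera.setPreviewCallback(null);
            camera.stopPreview();
            camera.release();
            camera = null;
        }
    }
}
```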
Location Data
For most augmented reality applications, just having the camera feed isn't enough.
You'll also need to determine the location of the device (and therefore its user). To do
this, you'll need access to fine or coarse location information, commonly obtained
through the APIs available in the android.location package, with its LocationManager
class. This way, your application can listen for location events and use
them to determine where "live" items of interest are located in relation to the device.
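A minimal sketch of this listening pattern is shown below, assuming the app already holds the ACCESS_FINE_LOCATION permission; the class name and the update intervals are our own choices.

```java
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

public class LocationTracker implements LocationListener {
    public void start(Context context) {
        LocationManager lm =
                (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
        // Fine-grained GPS updates: at most every 2 s or 5 m of movement.
        lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 2000, 5, this);
    }

    @Override
    public void onLocationChanged(Location location) {
        double lat = location.getLatitude();
        double lon = location.getLongitude();
        // Recompute distance and bearing to nearby points of interest here.
    }

    @Override public void onStatusChanged(String provider, int status, Bundle extras) {}
    @Override public void onProviderEnabled(String provider) {}
    @Override public void onProviderDisabled(String provider) {}
}
```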
Sensor Data
To determine the orientation of an Android device, you'll need to leverage the APIs
available through the SensorManager class in the android.hardware package. Some sensors
you're likely to tap include:
Sensor.TYPE_MAGNETIC_FIELD
Sensor.TYPE_ACCELEROMETER
Sensor.TYPE_ROTATION_VECTOR
Using sensors to let the user move the device around and see corresponding changes on the
screen really pulls the user into the application in an immersive fashion. This is critical
when the camera feed is showing, but in other applications, such as those exploring
prerecorded image data (such as Google Sky Map or Street View), the technique is still very
useful and intuitive for users.
Of course, the whole point of augmented reality is to draw something over the camera
feed that, well, augments what the user is seeing live. Conceptually, this is as simple
as drawing something over the camera feed. How you achieve this, though, is
up to you. You could read in each frame of the camera feed, add an overlay to it,
and draw the frame on the screen (perhaps as a Bitmap, or maybe as a texture on
a 3D surface). For instance, you could leverage the
android.hardware.Camera.PreviewCallback class, which allows your application to get
frame-by-frame images. Alternatively, you could use a standard SurfaceHolder with the
android.hardware.Camera object and simply draw over the top of the Surface as needed.
Finally, what and how you draw depends upon your individual application
requirements; both 2D and 3D graphics APIs are available on Android, most
notably the APIs in the android.graphics and android.opengl packages.
So where does the augmentation data come from? Generally speaking, you'll either be
getting this data from your own database, stored locally, or from a
database online somewhere, through a web or cloud service. If you have preloaded
augmentation data on the device, you'll likely want to use a SQLite database for quick
and easy lookups; you'll find the SQLite APIs in the android.database.sqlite package.
For web-based data, you'll want to connect to a web service using the normal
methods: HTTP and (usually) XML parsing.
For this, you can simply use the java.net.URL class with one of the XML parsing classes,
such as the XmlPullParser class, to parse the results.
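The parsing half of that pattern can be sketched in plain Java. Since XmlPullParser is an Android class, this example uses Java SE's StAX reader (javax.xml.stream), which follows the same pull-parsing model; the place/name element layout here is an assumption for illustration only.

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class PlaceXmlParser {
    // Collect the text content of every <name> element in the response.
    public static List<String> placeNames(String xml) {
        List<String> names = new ArrayList<>();
        try {
            XMLStreamReader r = XMLInputFactory.newInstance()
                    .createXMLStreamReader(new StringReader(xml));
            boolean inName = false;
            while (r.hasNext()) {
                int event = r.next();
                if (event == XMLStreamConstants.START_ELEMENT
                        && r.getLocalName().equals("name")) {
                    inName = true;
                } else if (event == XMLStreamConstants.CHARACTERS && inName) {
                    names.add(r.getText());
                } else if (event == XMLStreamConstants.END_ELEMENT) {
                    inName = false;
                }
            }
        } catch (Exception e) {
            throw new RuntimeException("malformed XML response", e);
        }
        return names;
    }

    public static void main(String[] args) {
        String xml = "<places><place><name>City Library</name></place>"
                   + "<place><name>Central Station</name></place></places>";
        System.out.println(placeNames(xml)); // prints [City Library, Central Station]
    }
}
```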
reality could mean that people are missing out on what's right in front of them. Some
people may prefer to use their AR iPhone applications rather than an experienced tour guide,
even though a tour guide may be able to offer a level of interaction, an experience and a
personal touch unavailable in a computer program. And there are times when a real plaque
on a building is preferable to a virtual one, which would be accessible only by people with
certain technologies [6].
Chapter 4
Features and Design
The internal working of the application is straightforward to explain. The application not
only shows points of interest on a 2D map but also labels these 2D points in a 3D
environment. The 3D plotting is achieved through the camera view, which makes it easier
for the user to check the relevant point in the real-time environment. To achieve this in
real time, the application updates the camera view with labels: data fetched from the GPS,
compass and accelerometer is fed into an algorithm that converts the 2D plot into 3D. This
process makes direction finding much simpler.
1. Hotels
2. Hospitals
3. Educational Areas
4. Shopping Centers
5. Airports
6. Railway Station
7. Police Station
8. Petrol Pumps
9. Entertainment areas
Figure 4.1: User Interface to select a Place
Option for viewing in:
- Map Mode
- Augmented Reality Mode
4.2 Map mode
This option loads the Google map with annotations that display the title of each location
and point out the locations on the map.
List view
Map view
It returns the places surrounding the user in the form of map view.
4.3 Augmented Reality mode
This option provides a visual view from the camera that displays the information using the
concepts of Augmented Reality.
Camera View
Chapter 5
Components and Architecture
[Component diagram: the GPS provides the location data; a compass rose indicates orientation.]
5.2 Hardware & Other Components
The following figure shows the hardware and other components that we used in the project.
The GPS obtains the current location of the user; once the current location is identified
correctly, information about the surroundings can be presented to the user.
The accelerometer checks the orientation of the cell phone.
The compass provides the north/east/west/south information.
The camera is used by the augmented reality mode to display the information on the screen.
Windows 7 has been used as an Operating System for the making of the project.
5.5 Environment
Eclipse (HELIOS)
5.6 Interface
5.8 APIs
The Google Maps API has been used in the development and plays an important role in the
application.
Chapter 6
Methodology
The figure shows the basic flow and gives a synopsis of how the LFA (Location Finder Application) works.
IV. Use device compass and accelerometer to set the device orientation
V. This information is then fed into an algorithm that maps these coordinates on the screen
Chapter 7
Implementation
In these methods, we call the camera parameters defined in the Android library.
7.3 Sending and receiving the location data
The process for sending and receiving the data is as follows:
– The location data (latitude/longitude) received in the previous step is sent to
an online database, in our case the Google database.
– From there we receive information about all the places in the user's surroundings
in a certain format; in our case the format is known as JSON.
– The JSON response is read by a JSON handler class that deals with this process.
– After being received, the surrounding location data is stored in the form of arrays.
– The data received may include information such as the title, address, vicinity and phone number.
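The JSON handling step can be sketched in plain Java. A real handler would use Android's built-in org.json classes; here the "name" values are pulled out by simple string scanning, and the response layout is an assumed, simplified Places-style format.

```java
import java.util.ArrayList;
import java.util.List;

public class PlacesParser {
    // Extract every "name" value from a Places-style JSON response.
    // Deliberately naive: fine for illustration, not for arbitrary JSON.
    public static List<String> extractNames(String json) {
        List<String> names = new ArrayList<>();
        String key = "\"name\"";
        int i = json.indexOf(key);
        while (i >= 0) {
            int colon = json.indexOf(':', i + key.length());
            int start = json.indexOf('"', colon + 1) + 1; // opening quote of value
            int end = json.indexOf('"', start);           // closing quote of value
            names.add(json.substring(start, end));
            i = json.indexOf(key, end);
        }
        return names;
    }

    public static void main(String[] args) {
        String sample = "{\"results\":[{\"name\":\"City Hospital\",\"vicinity\":\"Main St\"},"
                      + "{\"name\":\"County Clinic\",\"vicinity\":\"2nd Ave\"}]}";
        System.out.println(extractNames(sample)); // prints [City Hospital, County Clinic]
    }
}
```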
What we are interested in is the direction to the object relative to the device; in navigation,
this is called the bearing. A path between two points on a spherical object, such as the planet
Earth, is shortest when it follows what is called the great-circle distance. When using AR
techniques to find a location, or to display where some fixed object lies in relation to our
own location, this is a reasonable distance mechanism to use: as we get closer, the bearing
might change, but we would still be following the shortest route to that location.
An alternative solution might be to calculate a straight line, in 3D space, between the two
points on the sphere. This method assumes a path through the sphere, as opposed to around
its surface, and is therefore not useful here.
We have used the great-circle distance to calculate the distance. It can be found using the
Haversine formula, which calculates the distance between two points on a sphere from their
latitudes and longitudes.
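The Haversine computation, together with the initial bearing discussed above, can be written in plain Java. The class name and the Earth-radius constant are our own choices; on Android, Location.distanceBetween() offers a built-in alternative for the distance part.

```java
public class GreatCircle {
    static final double EARTH_RADIUS_KM = 6371.0; // mean Earth radius

    // Haversine formula: great-circle distance between two lat/lon points.
    public static double distanceKm(double lat1, double lon1,
                                    double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return EARTH_RADIUS_KM * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }

    // Initial bearing (forward azimuth) from point 1 to point 2, degrees [0, 360).
    public static double bearingDeg(double lat1, double lon1,
                                    double lat2, double lon2) {
        double p1 = Math.toRadians(lat1), p2 = Math.toRadians(lat2);
        double dLon = Math.toRadians(lon2 - lon1);
        double y = Math.sin(dLon) * Math.cos(p2);
        double x = Math.cos(p1) * Math.sin(p2)
                 - Math.sin(p1) * Math.cos(p2) * Math.cos(dLon);
        return (Math.toDegrees(Math.atan2(y, x)) + 360.0) % 360.0;
    }

    public static void main(String[] args) {
        // Stockholm to Karlskrona (home of BTH), roughly 380 km apart.
        System.out.printf("%.1f km%n",
                distanceKm(59.3293, 18.0686, 56.1612, 15.5869));
    }
}
```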
We need to get distance and bearing because ultimately we want the locations to be
determined from a database or feed. We can then filter relevant objects to display on the view
by distance.
This requires taking into account the orientation of the device, which uses a combination of the
accelerometer and compass. The end result will be a vector with rotation angles around each of
the 3 axes. This precisely orients the device with respect to the planet which is exactly
what we want.
The first method we can use is located in the SensorManager class: getOrientation().
This method takes a rotation matrix and returns a vector with azimuth, pitch, and roll values.
The azimuth value is the rotation around the Z-axis, where the Z-axis points straight
down to the center of the planet. The roll value is the rotation around the Y-axis, where the
Y-axis is tangential to the planetary sphere and points towards geomagnetic North. The pitch
value is the rotation around the X-axis, which is the vector product of the Z-axis and Y-axis;
it points to what could be called magnetic West.
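The extraction performed by getOrientation() can be mirrored in plain Java: given a 3×3 row-major rotation matrix (as produced by SensorManager.getRotationMatrix()), the three angles come out of simple trigonometry. The class name is ours and the angles are in radians; this is a sketch of the underlying math, not a replacement for the framework call.

```java
public class Orientation {
    // Mirrors the math of SensorManager.getOrientation():
    //   azimuth = atan2(R[1], R[4])   -- rotation about the Z-axis
    //   pitch   = asin(-R[7])         -- rotation about the X-axis
    //   roll    = atan2(-R[6], R[8])  -- rotation about the Y-axis
    public static double[] fromRotationMatrix(float[] R) {
        double azimuth = Math.atan2(R[1], R[4]);
        double pitch = Math.asin(-R[7]);
        double roll = Math.atan2(-R[6], R[8]);
        return new double[] { azimuth, pitch, roll };
    }

    public static void main(String[] args) {
        // Device flat and pointing north: identity matrix, all angles zero.
        float[] identity = { 1, 0, 0, 0, 1, 0, 0, 0, 1 };
        double[] o = fromRotationMatrix(identity);
        System.out.printf("azimuth=%.3f pitch=%.3f roll=%.3f%n", o[0], o[1], o[2]);
    }
}
```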
Remap the rotation matrix so that the camera points along the positive direction of the
Y-axis before calculating the orientation vector.
Now we have all of the information needed to place the target on the screen: the direction the
camera is facing and the location of the target relative to our location. We currently have two
vectors: the camera orientation and the bearing to the target. What we need to do is map the
screen to a range of rotation values, and then plot the target point on the View when
its location is within the field of view of the camera image as displayed on the screen.
– The orientation vector starts out with the azimuth, or the rotation around the Z-axis, which
points straight down.
– With the Y-axis pointing to Geomagnetic North, the azimuth will be compared to our bearing
and this will determine how far left or right the target is on the screen.
– Similarly, the pitch is used to determine how far up or down the target should be
drawn on the screen.
– Finally, the roll would change the virtual horizon that might show on the screen; that is, this
would be the total screen rotation.
The Camera.Parameters class can provide the field of view of the device camera, as
configured by the manufacturer. Now we can apply the appropriate rotation and translation to
place the target point correctly, relative to the screen and the field of view of the camera.
Basically, what we do is rotate by the roll value, translate up and down by the pitch value,
and then draw a horizon line. The azimuth is then used for left or right translation within
the field of view, and the location is drawn where expected.
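The left/right translation step can be sketched in plain Java: map the signed difference between the target's bearing and the camera azimuth onto a horizontal pixel position, given the camera's horizontal field of view. This is our own simplification of the mapping described above (a linear map across the field of view), not the thesis code itself.

```java
public class ScreenMapper {
    // Horizontal pixel position of a target, or -1 when it is outside the
    // camera's field of view. All angles in degrees.
    public static int xOnScreen(double azimuthDeg, double bearingDeg,
                                double fovDeg, int screenWidthPx) {
        // Signed angular offset folded into (-180, 180].
        double offset = ((bearingDeg - azimuthDeg + 540.0) % 360.0) - 180.0;
        if (Math.abs(offset) > fovDeg / 2) return -1; // not visible
        // Linear map: -fov/2 -> 0 px, +fov/2 -> screenWidth px.
        return (int) Math.round((offset + fovDeg / 2) / fovDeg * screenWidthPx);
    }

    public static void main(String[] args) {
        // 60-degree field of view on a 600 px wide screen:
        System.out.println(xOnScreen(0, 0, 60, 600));  // target dead ahead -> 300
        System.out.println(xOnScreen(0, 90, 60, 600)); // outside the FOV -> -1
    }
}
```

The same idea, with pitch in place of azimuth and the vertical field of view, gives the up/down translation.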
Chapter 8
Results and Conclusions
8.1 Results
Augmented Reality:
The snapshot shows the "Hospital" search in Augmented Reality Mode.
The snapshot shows the "Petrol Pumps" search in Augmented Reality Mode.
Map Mode:
The snapshot shows the "Universities" search
in map mode.
The snapshot shows the "Colleges" search in Map Mode.
List View:
The snapshot shows the "Hospitals" search in List View. The list view can be opened via the
options button on the Android smartphone.
8.2 Conclusion
This application, being a consumer-oriented product, aims at enhancing the user
experience. The designed application uses the concepts of augmented reality and the
Android phone's sensor data to achieve 3-D registration in real time. All information is
overlaid on the live camera feed. Augmented reality is very helpful for conveying
additional information, which is displayed on the smartphone's screen according to the
type of query given. The 3-D representation is widely appreciated, as it helps the user
get to a specific place. In Augmented Reality mode, the places are assigned tags [8];
those tags are then shown when the user points the camera in a specific direction.
Chapter 9
References
2. Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality '99,
pp. 85-94, October 20-21, 1999, San Francisco.
4. Wendy E. Mackay, Anne-Laure Fayard, Laurent Frobert and Lionel Médini, Centre
d'Études de la Navigation Aérienne, Orly Sud 205, 94542 Orly Aérogares, France.
5. Billinghurst, M., Kato, H., & Poupyrev, I. (2001). The MagicBook – Moving seamlessly
between reality and virtuality. IEEE Computer Graphics and Applications, 21(3), 6-8.
6. http://www.howstuffworks.com/augmented-reality.htm
7. http://developer.android.com/guide/topics/location/index.html
8. http://onvert.com/guides/what-is-augmented-reality/#sthash.8nLqcQ6x.dpuf