(English) Understanding Sensor Fusion and Tracking, Part 2 - Fusing A Mag, Accel, & Gyro Estimate (DownSub - Com)

Uploaded by

Phan Minh

In this video we're going to talk about how we can use sensor fusion to estimate an object's orientation. Now, you may call orientation by other names, like attitude, or maybe heading if you're just talking about direction along a 2D plane. This is why the fusion algorithm can also be referred to as an attitude and heading reference system (AHRS), but it's all the same thing: we want to figure out which way an object is facing relative to some reference, and we can use a number of different sensors to do this. For example, satellites can use star trackers to estimate attitude relative to the inertial star field, whereas an airplane could use an angle-of-attack sensor to measure the orientation of the wing relative to the incoming free air stream. Now, in this video we're going to focus on using a very popular set of sensors that you're going to find in every modern phone and in a wide variety of autonomous systems: a magnetometer, an accelerometer, and a gyro. The goal of this video is not to develop a fully fleshed-out inertial measurement system; there's just too much to cover to really do a thorough job. Instead, I want to conceptually build up the system, explain what each sensor is bringing to the table, and point out a few things to watch out for along the way. I'll also call out some other really good resources that I've linked to below, where you can dive into more of the details. So let's get to it.

I'm Brian, and welcome to a MATLAB Tech Talk.

When we're talking about orientation, we're really describing how far an object is rotated away from some known reference frame. For example, the pitch of an aircraft is how far the longitudinal axis is rotated off of the local horizon. So in order to define an orientation, we need to choose the reference frame that we want to describe the orientation against, and then specify the rotation from that frame using some representation method. We have several different ways to represent a rotation, and perhaps the easiest to visualize and understand at first is the idea of roll, pitch, and yaw. This representation works great in some situations; however, it has some widely known drawbacks in others, so we have other ways to define rotations for different situations: things like the direction cosine matrix and the quaternion. Now, the important thing for this discussion is not what a quaternion is or how it is formulated, but rather just to understand that these groups of numbers all represent a three-dimensional rotation between two different coordinate frames: the object's own coordinate frame, which is fixed to the body and rotates with it, and some external coordinate frame. It's this rotation, or these sets of numbers, that we're trying to estimate by measuring some quantity with sensors.

So let's get to our specific problem. Let's say we want to know the orientation of a phone that's sitting on a table; that is, the phone's body coordinate frame relative to the local north-east-down (NED) coordinate frame. We can find the absolute orientation using just a magnetometer and an accelerometer. Now, a little later on we're going to add a gyro to improve accuracy and correct for problems that occur when the system is moving, but for now we'll just stick with these two sensors. Simply speaking, we could measure the phone's acceleration, which would just be due to gravity since it's sitting stationary on the table, and we would know that that direction is up, the direction opposite the direction of gravity. Then we can measure the magnetic field in the body frame to determine north. But here's something to think about: the mag field points north, but it also points up or down depending on the hemisphere you're in, and it's not just a little bit. In North America the field lines are angled around sixty to eighty degrees down, which means the field is mostly in the gravity direction. Now, the reason a compass points north and not down is that the needle is constrained to rotate within a 2D plane; however, our mag sensor has no such constraint, so it's going to return a vector that's also partly in the direction of gravity. So to get north, we need to do some cross products.
We can start with our measured mag and accel vectors in the body frame. Down is the opposite direction of the acceleration vector; then East is the cross product of down and the magnetic field; and finally, North is the cross product of East and down. So the orientation of the body is simply the rotation between the body frame and the north-east-down frame, and I can build the direction cosine matrix directly from the north, east, and down vectors that I just calculated. Let's go check out an implementation of this fusion algorithm.
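The implementation in the video is in MATLAB; as a rough sketch of the same cross-product math, here is a NumPy version (the sensor values in the example are made up, and the sign convention assumed is that a stationary accelerometer reads specific force, i.e. +1 g pointing away from the ground):

```python
import numpy as np

def dcm_from_accel_mag(accel, mag):
    """Build the direction cosine matrix from accelerometer and magnetometer
    readings (both in the body frame), using the cross-product construction
    described above: down, then east, then north."""
    # Down is opposite the measured (gravity) specific force.
    down = -np.asarray(accel, dtype=float)
    down /= np.linalg.norm(down)
    # East is perpendicular to both down and the magnetic field vector.
    east = np.cross(down, np.asarray(mag, dtype=float))
    east /= np.linalg.norm(east)
    # North completes the right-handed triad.
    north = np.cross(east, down)
    # Rows of the DCM are the N, E, D axes expressed in body coordinates.
    return np.vstack([north, east, down])

# Hypothetical example: phone flat on a table, body x pointing north,
# field pointing north and steeply downward (units: m/s^2 and microtesla).
accel = [0.0, 0.0, 9.81]
mag = [20.0, 0.0, -45.0]
R = dcm_from_accel_mag(accel, mag)
```

Note that normalizing down and east before the final cross product guarantees the three rows come out orthonormal even when the raw mag vector has a large component along gravity.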


I have a physical IMU, the MPU-9250, and it has an accelerometer, a magnetometer, and a gyro, although for now we're not going to use the gyro. I've connected it to an Arduino through I2C, which is then connected to MATLAB through USB. I've pretty much just followed along with this example from the MathWorks website, which provides some of the functions that I'm using and which I've linked to below if you want to do the same. But let me show you my simple script. First, I connect to the Arduino and the IMU, and I'm using a MATLAB viewer to visualize the orientation; I update the viewer each time I read the sensors. This is a built-in function from the Sensor Fusion and Tracking Toolbox. The small amount of math here is basically reading the sensors, performing the cross products, and building the DCM, and that's pretty much the whole of it.

So if I run this, we can watch the algorithm in action. Notice that when it's sitting on the table, it does a pretty good job of finding down; it's in the positive x-axis. And if I rotate it to another orientation, you can see that it follows pretty well with my physical movements. So overall, pretty easy and straightforward, right?

Well, there are a few problems with this simple implementation, and I want to highlight two of them. The first is that accelerometers aren't just measuring gravity; they measure all linear accelerations, so if the system is moving around a lot, it's going to throw off the estimate of where down is. You can see here that I'm not really rotating the sensor much, but the viewer is jumping all over the place. This might not be much of a problem if your system is largely not accelerating, like a plane while it's cruising at altitude, or a phone that's sitting on a table. But linear accelerations aren't the only problem; even rotations can throw off the estimate, because an accelerometer that's not located at the center of rotation will sense an acceleration when the system rotates. So we have to figure out a way to deal with these corruptions. The second problem is that magnetometers are affected by disturbances in the magnetic field. Obviously, you can see that if I get a magnet near the IMU, the estimate is corrupted.

So what can we do about these two problems? Well, let's start with the magnetometer problem. If the magnetic disturbance is part of the system and rotates with the magnetometer, then it can be calibrated out. These are the so-called hard iron and soft iron sources. A hard iron source is something that generates its own magnetic field. This would be an actual magnet, like the ones in an electric motor, or it could be a coil that has a current running through it from the electronics themselves. If you tried to measure an external magnetic field, a hard iron source near the magnetometer would contribute to the measurement, and if we rotated the system around a single axis and measured the magnetic field, the result would be a circle that is offset from the origin: your magnetometer would read a larger intensity in one direction and a smaller intensity in the opposite direction. A soft iron source is something that doesn't generate its own magnetic field but is what you would call magnetic, you know, like a nail that is attracted to a magnet, or the metallic structure of your system. This type of metal will bend the magnetic field as it passes through and around it, and the amount of bending changes as that metal rotates. So a soft iron source that rotates with the magnetometer would distort the measurement, creating an oval rather than a circle. So even if you had a perfect, noiseless magnetometer, it's still going to return an incorrect measurement, simply because of the hard and soft iron sources that are near it, and your phone, and pretty much all systems, have both of those.

So let's talk about what we can do with calibration. If the system had no hard or soft iron sources and you rotated the magnetometer around in four pi steradians' worth of directions, then the magnetic field vector would trace out a perfect sphere, with the radius being the magnitude of the field. Now, a hard iron source would offset the sphere, and a soft iron source would distort it into some odd-shaped spheroid. If we can measure this distortion ahead of time, we can calibrate the magnetometer by finding the offset and the transformation matrix that would convert it back into a perfect sphere centered at the origin.

This transformation matrix and bias would then be applied to each measurement, essentially removing the hard and soft iron sources. This is exactly what your phone does when it asks you to spin it around in all directions before using the compass. Here I'm demonstrating this by calibrating my IMU using the MATLAB function magcal. I'm collecting a bunch of measurements in different orientations and then finding the calibration coefficients that will fit them to an ideal sphere. Now that I have an A matrix that will correct for the soft iron sources and a b vector that will remove the hard iron bias, I can add a calibration step to the fusion algorithm that I showed you previously, and this will produce a more accurate result than what I had before.
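Applying the calibration is just the subtract-then-multiply step. Here's a NumPy sketch of that step (the A and b coefficients below are invented for illustration; in practice they would come from an ellipsoid fit such as magcal's, whose convention is corrected = (raw - b) * A):

```python
import numpy as np

def apply_mag_cal(raw, A, b):
    """Correct raw magnetometer samples: subtract the hard-iron offset b,
    then multiply by the soft-iron correction matrix A."""
    raw = np.atleast_2d(np.asarray(raw, dtype=float))
    return (raw - b) @ A

# Hypothetical coefficients, as an ellipsoid fit might return them.
A = np.array([[1.02, 0.00, 0.00],
              [0.00, 0.97, 0.00],
              [0.00, 0.00, 1.01]])   # soft-iron correction
b = np.array([12.0, -3.5, 8.2])      # hard-iron offset, microtesla

corrected = apply_mag_cal([32.0, -3.5, 53.2], A, b)
```

After this step, the cloud of measurements should lie on a sphere centered at the origin, so the cross-product algorithm above sees an undistorted field vector.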

All right, now let's go back to solving the other problem: the corrupting linear accelerations. One way to address this is by predicting the linear acceleration and removing it from the measurement prior to using it. This might sound difficult to do, but it is possible if the acceleration is the result of the system's actuators, you know, rather than an unpredictable external disturbance. What we can do is take the commands that are sent to the actuators and play them through a model of the system to estimate the expected linear acceleration, and then subtract that value from the measurement. This is something that is possible if, say, your system is a drone and you're flying it around by commanding the four propellers.
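As a toy illustration of that predict-and-subtract idea, here is a deliberately crude, hypothetical quadrotor model (the thrust constant, mass, and the assumption that all thrust acts along body z are all made up for the sketch):

```python
import numpy as np

K_THRUST = 6.0e-6   # N per (rad/s)^2, invented constant
MASS = 0.5          # kg, invented

def predicted_linear_accel(prop_speeds):
    """Estimate body-frame linear acceleration from the four propeller
    speed commands (rad/s): thrust ~ sum of squared speeds, along body z."""
    thrust = K_THRUST * np.sum(np.square(prop_speeds))
    return np.array([0.0, 0.0, thrust / MASS])

def remove_linear_accel(accel_meas, prop_speeds):
    """Subtract the model-predicted linear acceleration from the
    accelerometer measurement before it's used to find 'down'."""
    return np.asarray(accel_meas, dtype=float) - predicted_linear_accel(prop_speeds)
```

A real model would also account for drone attitude, drag, and motor dynamics; the point is only that known actuator commands let you estimate, and subtract out, the non-gravity part of the reading.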


Now, if we can't predict the linear acceleration, or the external disturbances are too high, another option is to ignore accelerometer readings that are outside of some threshold from a 1 g measurement. If the magnitude of the reading is not close to the magnitude of gravity, then clearly the sensor is picking up on other movement and it can't be trusted. This keeps corrupted measurements from getting into our fusion algorithm, but it's not a great solution, because we stop estimating orientation during these times and we lose track of the state of the system.
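That gating check can be sketched in a few lines (the tolerance value here is an arbitrary choice, not something from the video):

```python
import numpy as np

G = 9.81  # magnitude of gravity, m/s^2

def trust_accel(accel, tol=0.5):
    """Return True only when the measured magnitude is within tol m/s^2 of
    1 g, i.e. when the reading is plausibly gravity alone and safe to use
    for estimating 'down'."""
    return bool(abs(np.linalg.norm(accel) - G) < tol)

# A near-stationary reading passes; a reading during hard movement fails.
ok = trust_accel([0.1, -0.2, 9.78])
bad = trust_accel([3.0, 1.0, 14.0])
```

The trade-off the transcript describes is visible here: whenever `trust_accel` returns False, the fusion algorithm simply has no absolute orientation update for that sample.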

Again, it's not really a problem if we're trying to estimate orientation for a static object; this algorithm would work perfectly fine. However, often we want to know the orientation of something that is rotating and accelerating, so we need something else here to help us out. What we can do is add a gyro into the mix to measure the angular rate of the system. In fact, the combination of magnetometer, accelerometer, and gyro is so popular that they're often packaged together as an inertial measurement unit, like I have with my MPU-9250.

So the question is, how does the gyro help? To start, I think it's useful to think about how we can estimate orientation for a rotating object with just the gyro on its own, no accelerometer and no magnetometer. For this, we can multiply the angular rate measurement by the sample time to get the change in angle during that time, and then, if we knew the orientation of the phone at the previous sample time, we could just add this delta angle to it and have an updated estimate of the current orientation. And if the object isn't rotating, then the delta angle would be zero and the orientation wouldn't change, so it all works out. By repeating this process for the next sample time, and the one after that, we're going to know the orientation of the phone over time. This process is called dead reckoning, and essentially it's just integrating the gyro measurement. Now, there are downsides to dead reckoning. One, you still need to know the initial orientation before you begin, so we need to figure that out. And two, sensors aren't perfect; they have bias and other high-frequency noise that will corrupt our estimation. Integration acts like a low-pass filter, so that high-frequency noise is smoothed out a little bit, which is good, but the result drifts away from the true orientation due to random walk, as well as from integrating any bias in the measurements. So over time, the orientation will smoothly drift away from the truth.
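The integration step above, reduced to a single axis for clarity, looks like this (a minimal sketch; a full 3D version would integrate quaternions or DCMs rather than a scalar angle):

```python
import numpy as np

def dead_reckon(theta0, gyro_rates, dt):
    """Integrate single-axis gyro rates (rad/s) from a known initial angle
    theta0 (rad): theta[k+1] = theta[k] + omega[k] * dt."""
    theta = float(theta0)
    history = [theta]
    for omega in gyro_rates:
        theta += omega * dt          # delta angle = rate * sample time
        history.append(theta)
    return np.array(history)

# Rotating at a constant 0.1 rad/s for 10 samples at 100 Hz:
angles = dead_reckon(0.0, [0.1] * 10, dt=0.01)

# The drift problem: a small constant bias (0.002 rad/s here, invented)
# integrates into an ever-growing angle error, even with zero true motion.
drift = dead_reckon(0.0, [0.002] * 1000, dt=0.01)
```

The second call shows why dead reckoning alone isn't enough: the bias-only input produces a smoothly growing error with no mechanism to pull it back toward the truth.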

So at this point we have two different ways to estimate orientation: one using the accelerometer and the magnetometer, and the other using just the gyro, and each has its own respective benefits and problems. This is where sensor fusion comes in once again: we can use it to combine these two estimates in a way that emphasizes each of their strengths and minimizes their weaknesses.

Now, there are a number of sensor fusion algorithms that we can use, like a complementary filter, or a Kalman filter, or the more specialized but common Madgwick or Mahony filters, but at their core, every one of them does essentially the same thing. They initialize the attitude, either by setting it manually or by using the initial results of the mag and accelerometer, and then over time they use the direction of the mag field and gravity to slowly correct for the drift in the gyro. Now, I go into a lot more detail on this in my video on the complementary filter, and MathWorks has a series on the mechanics of the Kalman filter; both are linked below. But just in case you don't go and watch them right away, let me go over a really high-level concept of how this blending works.

Let's put our two solutions at opposite ends of a scale that represents our trust in each one, and place a slider that specifies which solution we trust more. If the slider is all the way to the left, then we trust our mag and accel solution 100%, and we just use that value for our orientation. All the way to the right, and we use the dead reckoning solution 100%. When the slider is in between, this is saying that we trust both solutions some amount, and therefore want to take a portion of one and add it to the complementary portion of the other. By putting the slider almost entirely toward the dead reckoning solution, we're mostly trusting the smoothness and quick updates of the integrated gyro measurements, which gives us good estimates during rotations and linear accelerations, but we are ever so gently correcting that solution back towards the absolute measurement of the mag and accel to remove the bias before it has a chance to grow too large. So these two approaches complement each other. Now, for the complementary filter, you as the designer figure out manually where to place this slider, that is, how much you trust one measurement over the other. But with a Kalman filter, the optimal gain, or the optimal position of the slider, is calculated for you after you specify things like how much noise there is in the measurements and how good you think your system model is. So the bottom line is that we're doing some kind of fancy averaging between the two solutions based on how much trust we have in each of them. Now, if you want to practice this yourself, the MATLAB tutorial I used earlier goes through a Kalman filter approach using the function ahrsfilter.
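The slider idea, for a single-axis complementary filter, can be sketched like this (alpha = 0.98 is an arbitrary example setting that heavily trusts the gyro, not a value from the video):

```python
import numpy as np

def complementary_step(theta_prev, gyro_rate, theta_absolute, dt, alpha=0.98):
    """One update of a single-axis complementary filter. The 'slider' alpha
    weights the dead-reckoned (gyro-integrated) estimate; 1 - alpha weights
    the absolute mag/accel estimate."""
    gyro_estimate = theta_prev + gyro_rate * dt   # dead reckoning step
    return alpha * gyro_estimate + (1.0 - alpha) * theta_absolute

# Stationary example: the gyro reports a small bias of 0.01 rad/s, while
# the mag/accel solution keeps reporting the true angle of 0. Pure dead
# reckoning would drift to 0.1 rad after 1000 steps; the gentle correction
# holds the estimate to a small, bounded offset instead.
theta = 0.0
for _ in range(1000):
    theta = complementary_step(theta, gyro_rate=0.01, theta_absolute=0.0,
                               dt=0.01, alpha=0.98)
```

Moving alpha toward 1.0 slides the trust toward dead reckoning; moving it toward 0.0 slides it toward the mag/accel solution, which is exactly the manual tuning choice described above.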


And that's where I'm going to leave this video. In the next video, we're going to take this one step further and add GPS, and show how our IMU and orientation estimate can help us improve the position that we get from the GPS sensor. So if you don't want to miss that, or other future Tech Talk videos, don't forget to subscribe to this channel. Also, if you want to check out my channel, Control System Lectures, I cover more control theory topics there as well. I'll see you next time.
