RVSP Unit 2
Moments

Types of moments:

(1) Moment about the origin: the nth moment of a random variable X about the origin is defined as

    m_n = E[X^n]

In case of n = 1: m_1 = E[X], the mean.
In case of n = 2: m_2 = E[X^2], the mean-square value.

(2) Moment about an arbitrary point: when we take any arbitrary point a, we get the moment about a, which can be defined as

    E[(X − a)^n]

For a = 0, this reduces to the moment about the origin, E[X^n].
Moment about mean (or) central moment

When we take the deviation from the actual mean and calculate the moments, this is known as the moment about the mean (or) central moment. It can be defined as

    μ_n = E[(X − X̄)^n]

In case of n = 1:

    μ_1 = E[X − X̄] = E[X] − E[X̄] = X̄ − X̄ = 0

In case of n = 2:

    μ_2 = E[(X − X̄)^2] = variance

In case of n = 3:

    μ_3 = E[(X − X̄)^3], which measures skewness.
Note

(a) E[X] = ∫ x f_X(x) dx        (in case of a continuous random variable)

(b) E[X] = Σ x p(x)             (in case of a discrete random variable)

(c) Variance = ∫ (x − X̄)^2 f_X(x) dx   (in case of a continuous random variable)

The above terms, defined here in continuous form, are obtained from

    E[X^n] = ∫ x^n f_X(x) dx
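These definitions are easy to check numerically. The sketch below (a minimal illustration; the fair six-sided die is an assumed example, not from the text) computes moments about the origin and about the mean with exact rational arithmetic:

```python
from fractions import Fraction

# pmf of a fair six-sided die (assumed example)
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def raw_moment(pmf, n):
    """m_n = E[X^n] = sum_x x^n p(x)  (moment about the origin)"""
    return sum(Fraction(x) ** n * p for x, p in pmf.items())

def central_moment(pmf, n):
    """mu_n = E[(X - mean)^n]  (moment about the mean)"""
    mean = raw_moment(pmf, 1)
    return sum((Fraction(x) - mean) ** n * p for x, p in pmf.items())

mean = raw_moment(pmf, 1)     # m_1 = 7/2
var = central_moment(pmf, 2)  # mu_2 = 35/12
mu1 = central_moment(pmf, 1)  # mu_1 is always 0, as derived above
```

The first central moment comes out exactly zero, matching the derivation of μ_1 above.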
3.3 Properties of Expectation

(1) If a random variable X is a constant, i.e., X = a, then E[a] = a, where a is a constant.   (3.8)

Proof

We know that

    E[X] = ∫ x f_X(x) dx = ∫ a f_X(x) dx = a ∫ f_X(x) dx = a · 1 = a

    ∴ E[a] = a

(2) E[aX] = a E[X], where a is a constant.

(3) E[aX + b] = a E[X] + b, where a and b are constants.

Proof

    E[aX + b] = ∫ (ax + b) f_X(x) dx = a ∫ x f_X(x) dx + b ∫ f_X(x) dx = a E[X] + b
(4) If X ≥ 0, then E[X] ≥ 0.

Proof

    E[X] = ∫ x f_X(x) dx

Since x ≥ 0 wherever f_X(x) > 0, and f_X(x) ≥ 0, the integrand is non-negative. Then

    E[X] = ∫ x f_X(x) dx ≥ 0

(5) |E[X]| ≤ E[|X|], since |∫ x f_X(x) dx| ≤ ∫ |x| f_X(x) dx.

(6) If g_1(X) and g_2(X) are two functions of a random variable X, then

    E[g_1(X) + g_2(X)] = E[g_1(X)] + E[g_2(X)]
Example 3.2

If X is a discrete random variable with probability mass function given as in Table 3.1, find
(i) E[X], (ii) E[2X + 3], (iii) E[X^2], and (iv) E[(2X + 1)^2].

    X    | −2   | −1   | 0    | 1    | 2
    P(X) | 3/10 | 2/10 | 1/10 | 3/10 | 1/10

Solution

(i) E[X] = Σ x_i P(x_i)
         = (−2)(3/10) + (−1)(2/10) + (0)(1/10) + (1)(3/10) + (2)(1/10)
    E[X] = −0.3

(ii) E[2X + 3]

From the properties, we know that

    E[2X + 3] = 2E[X] + 3 = 2(−0.3) + 3 = −0.6 + 3 = 2.4

(iii) To find E[X^2], let g(X) = X^2. Then, we know that

    E[g(X)] = Σ g(x_i) P(x_i) = Σ x_i^2 P(x_i)
            = (4)(3/10) + (1)(2/10) + (0)(1/10) + (1)(3/10) + (4)(1/10)
    E[X^2] = 2.1

(iv) To find E[(2X + 1)^2], let g(X) = (2X + 1)^2 = 4X^2 + 4X + 1. Then

    E[(2X + 1)^2] = 4E[X^2] + 4E[X] + 1 = 4(2.1) + 4(−0.3) + 1 = 8.2
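As a cross-check, the snippet below recomputes all four expectations. The pmf values are an assumption consistent with the worked results, chosen so that E[X] = −0.3 and E[X²] = 2.1 as stated:

```python
from fractions import Fraction

# assumed pmf, reproducing the stated E[X] = -0.3 and E[X^2] = 2.1
pmf = {-2: Fraction(3, 10), -1: Fraction(2, 10), 0: Fraction(1, 10),
        1: Fraction(3, 10),  2: Fraction(1, 10)}

def E(g, pmf):
    """E[g(X)] = sum_x g(x) p(x)"""
    return sum(g(x) * p for x, p in pmf.items())

m1 = E(lambda x: x, pmf)                 # E[X]        = -3/10
m2 = E(lambda x: x * x, pmf)             # E[X^2]      = 21/10
e1 = E(lambda x: 2 * x + 3, pmf)         # E[2X+3]     = 2E[X] + 3
e2 = E(lambda x: (2 * x + 1) ** 2, pmf)  # E[(2X+1)^2] = 4E[X^2] + 4E[X] + 1
```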
3.5.3 Properties of Variance

(1) The variance of a constant is zero, i.e., if k is a constant, then var(k) = 0.   (3.26)

Proof

    var(X) = E[(X − X̄)^2]

    var(k) = E[(k − k̄)^2] = E[(k − k)^2] = E[0] = 0

(2) The variance of kX is given by var(kX) = k^2 var(X).

Proof

    var(kX) = E[(kX − k X̄)^2]
            = E[k^2 (X − X̄)^2]
            = k^2 E[(X − X̄)^2]

    ∴ var(kX) = k^2 var(X)
(3) For a given random variable X, the relationship between the variance and the moments is given by

    σ_X^2 = m_2 − m_1^2   (3.25)

Proof

    σ_X^2 = E[(X − X̄)^2]
          = E[X^2 + X̄^2 − 2X X̄]
          = E[X^2] + E[X̄^2] − 2E[X X̄]
          = E[X^2] + X̄^2 − 2X̄ E[X]
          = E[X^2] + X̄^2 − 2X̄ · X̄

    σ_X^2 = E[X^2] − X̄^2

or  σ_X^2 = m_2 − m_1^2
(4) If X is a random variable and a, b are real constants, then

    var(aX + b) = a^2 var(X)

Proof

The variance of (aX + b) is

    var(aX + b) = E[((aX + b) − (a X̄ + b))^2]
                = a^2 E[(X − X̄)^2]
                = a^2 var(X)
(5) If two random variables X_1 and X_2 are independent, then

    var(X_1 + X_2) = var(X_1) + var(X_2)

and var(X_1 − X_2) = var(X_1) + var(X_2)

Proof

    var(X_1 + X_2) = E[((X_1 + X_2) − (X̄_1 + X̄_2))^2]
                   = E[((X_1 − X̄_1) + (X_2 − X̄_2))^2]
                   = E[(X_1 − X̄_1)^2] + E[(X_2 − X̄_2)^2] + 2E[(X_1 − X̄_1)(X_2 − X̄_2)]

Since X_1 and X_2 are independent,

    E[(X_1 − X̄_1)(X_2 − X̄_2)] = E[X_1 − X̄_1] E[X_2 − X̄_2] = (X̄_1 − X̄_1)(X̄_2 − X̄_2) = 0

Therefore, var(X_1 + X_2) = var(X_1) + var(X_2)

Similarly, var(X_1 − X_2) = var(X_1) + var(−X_2)
                          = var(X_1) + (−1)^2 var(X_2)

    ∴ var(X_1 − X_2) = var(X_1) + var(X_2)
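These properties can be verified exactly on a small discrete distribution. The sketch below (the fair-die pmf is an assumed example, not from the text) checks var(k) = 0, var(aX + b) = a² var(X), and the additivity of variance for an independent sum:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die (assumed example)

def E(g, pmf):
    return sum(g(x) * p for x, p in pmf.items())

def var(g, pmf):
    mean = E(g, pmf)
    return E(lambda x: (g(x) - mean) ** 2, pmf)

v = var(lambda x: x, pmf)                 # var(X) = 35/12
v_affine = var(lambda x: 3 * x - 5, pmf)  # var(3X - 5) = 9 var(X)
v_const = var(lambda x: 7, pmf)           # var(7) = 0

# independent sum: joint pmf of two dice, var(X1 + X2) = var(X1) + var(X2)
joint = {(a, b): Fraction(1, 36) for a in range(1, 7) for b in range(1, 7)}
mean_sum = sum((a + b) * p for (a, b), p in joint.items())
v_sum = sum(((a + b) - mean_sum) ** 2 * p for (a, b), p in joint.items())
```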
3.5.4 Relationship between Central Moments and Moments about the Origin

Let X be a random variable. Then the nth central moment is given by

    μ_n = E[(X − X̄)^n]

We know from the binomial theorem that

    E[(X − X̄)^n] = E[ Σ_{k=0}^{n} (−1)^k nC_k X̄^k X^{n−k} ]
                 = Σ_{k=0}^{n} (−1)^k nC_k m_1^k m_{n−k}

where X̄ = m_1 = mean value and E[X^{n−k}] = m_{n−k} = (n − k)th moment about the origin.

In case of n = 2:

    μ_2 = Σ_{k=0}^{2} (−1)^k 2C_k m_1^k m_{2−k}
        = (−1)^0 2C_0 m_1^0 m_2 + (−1)^1 2C_1 m_1 m_1 + (−1)^2 2C_2 m_1^2 m_0
        = m_2 − 2m_1^2 + m_1^2 m_0

But m_0 = 1. Therefore

    μ_2 = m_2 − m_1^2 = σ_X^2
In case of n = 3:

    μ_3 = Σ_{k=0}^{3} (−1)^k 3C_k m_1^k m_{3−k}
        = (−1)^0 3C_0 m_1^0 m_3 + (−1)^1 3C_1 m_1 m_2 + (−1)^2 3C_2 m_1^2 m_1 + (−1)^3 3C_3 m_1^3 m_0

We know that m_0 = 1. Therefore

    μ_3 = m_3 − 3m_1 m_2 + 3m_1^3 − m_1^3
        = m_3 − 3m_1(m_2 − m_1^2) − m_1^3

In terms of σ_X^2, since σ_X^2 = m_2 − m_1^2,

    μ_3 = m_3 − 3m_1 σ_X^2 − m_1^3
        = E[X^3] − 3X̄ σ_X^2 − X̄^3

Note: μ_3 can also be obtained directly from

    μ_3 = E[(X − X̄)^3] = E[X^3 − 3X^2 X̄ + 3X X̄^2 − X̄^3]
        = E[X^3] − 3X̄ E[X^2] + 3X̄^2 E[X] − X̄^3
        = E[X^3] − X̄^3 − 3X̄ (E[X^2] − X̄^2)

This expression gives the skew of the density function when the moments are known.
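Both identities, μ₂ = m₂ − m₁² and μ₃ = m₃ − 3m₁m₂ + 2m₁³, can be confirmed exactly on a small pmf (the asymmetric three-point pmf below is an assumed example, not from the text):

```python
from fractions import Fraction

# a small asymmetric pmf (assumed example)
pmf = {0: Fraction(1, 2), 1: Fraction(1, 3), 3: Fraction(1, 6)}

def m(n):
    """n-th moment about the origin: m_n = E[X^n]"""
    return sum(Fraction(x) ** n * p for x, p in pmf.items())

def mu(n):
    """n-th central moment, computed directly from the definition"""
    mean = m(1)
    return sum((Fraction(x) - mean) ** n * p for x, p in pmf.items())

# identities relating central moments to moments about the origin
assert mu(2) == m(2) - m(1) ** 2
assert mu(3) == m(3) - 3 * m(1) * m(2) + 2 * m(1) ** 3
```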
3.6 Functions for Moments

To calculate the nth moments of a random variable X, two functions are generally used:
(1) the characteristic function and (2) the moment generating function.

3.6.1 Characteristic Function

Consider a random variable X with a probability density function f_X(x). Then the expected value of the function e^{jωX} is called the characteristic function. It is expressed as

    Φ_X(ω) = E[e^{jωX}]   (3.33)

It is a function of the real variable ω, −∞ < ω < ∞, where j is the imaginary operator. For a continuous random variable,

    Φ_X(ω) = ∫ e^{jωx} f_X(x) dx   (3.34)

and for a discrete random variable

    Φ_X(ω) = Σ e^{jωx} p(x)   (3.35)

The characteristic function transforms the random variable X into a function of the real variable ω. It can be expressed as a Fourier transform of f_X(x) with the sign of ω reversed. Therefore, the inverse Fourier transform of Φ_X(ω) gives the probability density function with the sign of x reversed. Hence

    f_X(x) = (1/2π) ∫ Φ_X(ω) e^{−jωx} dω   (3.36)

So the functions Φ_X(ω) and f_X(x) are Fourier transform pairs with the sign of the variable reversed.
Theorem

If Φ_X(ω) is the characteristic function of a random variable X, then the nth moment of X is given by

    m_n = (−j)^n d^n Φ_X(ω)/dω^n |_{ω=0}   (3.37)

Proof

Consider a random variable X with the characteristic function

    Φ_X(ω) = E[e^{jωX}]

Expanding the exponential as a series,

    Φ_X(ω) = E[1 + jωX + (jω)^2 X^2/2! + (jω)^3 X^3/3! + ...]
           = 1 + jω E[X] + (jω)^2 E[X^2]/2! + (jω)^3 E[X^3]/3! + ...

    Φ_X(ω) = m_0 + jω m_1 + (jω)^2 m_2/2! + (jω)^3 m_3/3! + ...

To find the moments, substitute ω = 0:

    Φ_X(0) = m_0 = 1

Differentiation of Φ_X(ω) with respect to ω gives

    dΦ_X(ω)/dω = j m_1 + j^2 ω m_2 + j^3 ω^2 m_3/2! + ...

At ω = 0,

    dΦ_X(ω)/dω |_{ω=0} = j m_1   ⇒   m_1 = (−j) dΦ_X(ω)/dω |_{ω=0}

Differentiating again,

    d^2 Φ_X(ω)/dω^2 = j^2 m_2 + j^3 ω m_3 + ...

At ω = 0,

    d^2 Φ_X(ω)/dω^2 |_{ω=0} = j^2 m_2   ⇒   m_2 = (−j)^2 d^2 Φ_X(ω)/dω^2 |_{ω=0}

In general,

    m_n = (−j)^n d^n Φ_X(ω)/dω^n |_{ω=0}

Thus the nth moment of X can be obtained from the nth derivative of the characteristic function at ω = 0.
3.6.2 Properties of the Characteristic Function

(1) The characteristic function is unity at ω = 0, i.e.,

    Φ_X(0) = 1   (3.38)

Proof

The characteristic function is given by

    Φ_X(ω) = E[e^{jωX}]

At ω = 0,

    Φ_X(0) = E[e^0] = E[1] = 1

(2) The magnitude of the characteristic function never exceeds its value at ω = 0, i.e.,

    |Φ_X(ω)| ≤ Φ_X(0) = 1   (3.39)

Proof

Let the characteristic function be

    Φ_X(ω) = E[e^{jωX}]

The amplitude of Φ_X(ω) is

    |Φ_X(ω)| = |∫ e^{jωx} f_X(x) dx|

Since the magnitude of an integral never exceeds the integral of the magnitude,

    |Φ_X(ω)| ≤ ∫ |e^{jωx}| f_X(x) dx

Since |e^{jωx}| = 1,

    |Φ_X(ω)| ≤ ∫ f_X(x) dx = 1

(3) Φ_X(ω) is a continuous function of ω in the range −∞ < ω < ∞.   (3.40)

Proof

We know that

    Φ_X(ω) = ∫ e^{jωx} f_X(x) dx

Since ω is continuous, e^{jωx} is also continuous in ω. Hence Φ_X(ω) is a continuous function.
(4) The characteristic function has conjugate symmetry:

    Φ_X(−ω) = Φ_X*(ω)

Proof

    Φ_X(ω) = E[e^{jωX}]

    Φ_X(−ω) = E[e^{−jωX}] = E[(e^{jωX})*] = (E[e^{jωX}])*

    ∴ Φ_X(−ω) = Φ_X*(ω)
(5) If Φ_X(ω) is the characteristic function of a random variable X, then the characteristic function of Y = aX + b is given by

    Φ_Y(ω) = e^{jωb} Φ_X(aω), where a and b are real constants.   (3.42)

Proof

Given Y = aX + b. Then

    Φ_Y(ω) = E[e^{jω(aX + b)}]
           = E[e^{jωaX} e^{jωb}]
           = e^{jωb} E[e^{j(ωa)X}]

    ∴ Φ_Y(ω) = e^{jωb} Φ_X(aω)
(6) If Φ_X(ω) is the characteristic function of a random variable X, then the characteristic function of cX is Φ_X(cω), where c is a real constant.   (3.43)

Proof

We know that

    Φ_X(ω) = E[e^{jωX}]

Then

    Φ_{cX}(ω) = E[e^{jω(cX)}] = E[e^{j(cω)X}] = Φ_X(cω)

(7) If X_1 and X_2 are two independent random variables, then the characteristic function of their sum is the product of their characteristic functions.

Proof

Given Φ_X(ω) = E[e^{jωX}]. Then

    Φ_{X_1 + X_2}(ω) = E[e^{jω(X_1 + X_2)}] = E[e^{jωX_1} e^{jωX_2}]

Since X_1 and X_2 are independent,

    Φ_{X_1 + X_2}(ω) = E[e^{jωX_1}] E[e^{jωX_2}]

    ∴ Φ_{X_1 + X_2}(ω) = Φ_{X_1}(ω) Φ_{X_2}(ω)
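Properties (1), (2) and the independence property can be verified numerically for discrete random variables, where Φ_X(ω) is a finite sum. The sketch below (the two pmfs are assumed examples) builds the pmf of X₁ + X₂ by convolution and compares Φ_{X₁+X₂}(ω) with the product Φ_{X₁}(ω)Φ_{X₂}(ω):

```python
import cmath
from fractions import Fraction

def phi(pmf, w):
    """characteristic function of a discrete RV: sum_x e^{jwx} p(x)"""
    return sum(cmath.exp(1j * w * x) * float(p) for x, p in pmf.items())

# two independent discrete RVs (assumed example pmfs)
p1 = {0: Fraction(1, 2), 1: Fraction(1, 2)}
p2 = {0: Fraction(2, 3), 2: Fraction(1, 3)}

# pmf of X1 + X2 under independence (discrete convolution)
psum = {}
for a, pa in p1.items():
    for b, pb in p2.items():
        psum[a + b] = psum.get(a + b, Fraction(0)) + pa * pb

w = 0.7
lhs = phi(psum, w)             # Phi_{X1+X2}(w)
rhs = phi(p1, w) * phi(p2, w)  # Phi_{X1}(w) * Phi_{X2}(w)
assert abs(lhs - rhs) < 1e-12
assert abs(phi(p1, 0.0) - 1.0) < 1e-12  # property (1): Phi_X(0) = 1
assert abs(phi(p1, w)) <= 1.0 + 1e-12   # property (2): |Phi_X(w)| <= 1
```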
Example 3.5

Find the characteristic function of a uniformly distributed random variable X in the range [0, 1] and hence find m_1.

Solution

For a uniformly distributed random variable, the density function is

    f_X(x) = 1/(b − a)   for a ≤ x ≤ b
           = 0           otherwise

Here a = 0 and b = 1, so f_X(x) = 1 for 0 ≤ x ≤ 1. Then

    Φ_X(ω) = ∫_0^1 e^{jωx} dx = [e^{jωx}/jω]_0^1

    Φ_X(ω) = (e^{jω} − 1)/jω

To find m_1, we know that

    m_1 = (−j) dΦ_X(ω)/dω |_{ω=0}

Expanding e^{jω} as a series,

    Φ_X(ω) = 1 + jω/2! + (jω)^2/3! + ...

    dΦ_X(ω)/dω = j/2 + 2j^2 ω/3! + ...

At ω = 0,

    m_1 = (−j)(j/2) = 1/2
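The result m₁ = 1/2 agrees with the well-known mean of a uniform [0, 1] variable and can be confirmed numerically by differentiating Φ_X(ω) with a central difference (a minimal sketch; the step size h is an arbitrary choice):

```python
import cmath

def phi(w):
    """characteristic function of U[0,1]; the w -> 0 limit is 1"""
    if abs(w) < 1e-12:
        return 1.0 + 0j
    return (cmath.exp(1j * w) - 1) / (1j * w)

# m1 = (-j) * dPhi/dw at w = 0, estimated by a central difference
h = 1e-5
dphi0 = (phi(h) - phi(-h)) / (2 * h)
m1 = (-1j * dphi0).real  # close to 0.5
```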
3.6.3 Moment Generating Function

The moment generating function (MGF) of a random variable is also used to generate the moments about the origin.

Consider a random variable X with a probability density function f_X(x). Then the moment generating function of X is defined as the expected value of the function e^{vX}. It can be expressed as

    M_X(v) = E[e^{vX}]   (3.45)

where v is a real variable, −∞ < v < ∞.
Theorem

The nth moment of X is given by the nth derivative of the MGF at v = 0:

    m_n = d^n M_X(v)/dv^n |_{v=0}   (3.48)

Proof

Consider a random variable X with moment generating function

    M_X(v) = E[e^{vX}]

Expanding the exponential,

    e^{vX} = 1 + vX + v^2 X^2/2! + v^3 X^3/3! + ... + v^n X^n/n! + ...

    M_X(v) = E[e^{vX}] = 1 + v E[X] + v^2 E[X^2]/2! + ... + v^n E[X^n]/n! + ...

    M_X(v) = m_0 + v m_1 + v^2 m_2/2! + ... + v^n m_n/n! + ...

To find the moments, substitute v = 0:

    M_X(0) = m_0 = 1

At v = 0,

    dM_X(v)/dv |_{v=0} = m_1

    d^2 M_X(v)/dv^2 |_{v=0} = m_2

The nth moment of X is given by

    m_n = d^n M_X(v)/dv^n |_{v=0}

Hence proved. Thus the nth moment of X can be obtained from the nth derivative of the MGF at v = 0.
(1) M_X(0) = 1.

(2) Let X be a random variable with moment generating function M_X(v). Then the moment generating function for Y = aX + b is given by

    M_Y(v) = e^{bv} M_X(av)   (3.50)

Proof

We know that

    M_X(v) = E[e^{vX}]

Then

    M_Y(v) = E[e^{vY}] = E[e^{v(aX + b)}]
           = E[e^{vaX} e^{bv}]
           = e^{bv} E[e^{(av)X}]

    ∴ M_Y(v) = e^{bv} M_X(av)
(3) If M_X(v) is a moment generating function of a random variable X, then the MGF of cX is

    M_{cX}(v) = M_X(cv), where c is a real constant.   (3.51)

Proof

We know that

    M_X(v) = E[e^{vX}]

Then

    M_{cX}(v) = E[e^{v(cX)}] = E[e^{(cv)X}] = M_X(cv)

(4) If X_1 and X_2 are two independent random variables, then

    M_{X_1 + X_2}(v) = M_{X_1}(v) M_{X_2}(v)   (3.53)

Proof

We know that

    M_X(v) = E[e^{vX}]

    M_{X_1 + X_2}(v) = E[e^{v(X_1 + X_2)}] = E[e^{vX_1} e^{vX_2}]

Since X_1 and X_2 are independent,

    M_{X_1 + X_2}(v) = E[e^{vX_1}] E[e^{vX_2}] = M_{X_1}(v) M_{X_2}(v)
Example 3.6

The pdf of a random variable is given by

    f_X(x) = e^{−x}   for x ≥ 0

Find M_X(v), m_1 and m_2.

Solution

We know that

    M_X(v) = E[e^{vX}] = ∫ e^{vx} f_X(x) dx
           = ∫_0^∞ e^{vx} e^{−x} dx
           = ∫_0^∞ e^{−(1−v)x} dx
           = [e^{−(1−v)x}/(−(1−v))]_0^∞

    M_X(v) = 1/(1 − v)   for v < 1

Now

    m_1 = dM_X(v)/dv |_{v=0} = 1/(1 − v)^2 |_{v=0} = 1

and

    m_2 = d^2 M_X(v)/dv^2 |_{v=0} = 2/(1 − v)^3 |_{v=0} = 2
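The same two moments can be recovered numerically from M_X(v) = 1/(1 − v) by finite differences at v = 0 (a minimal sketch; the step size h is an arbitrary choice):

```python
def M(v):
    """MGF of f(x) = e^{-x}, x >= 0: M(v) = 1/(1 - v), valid for v < 1"""
    return 1.0 / (1.0 - v)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)              # first derivative at 0  -> close to 1
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h ** 2  # second derivative at 0 -> close to 2
```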
3.7 Inequalities

There are three important inequalities which are very useful in solving some types of probability problems:

(1) Chebyshev's inequality
(2) Markov's inequality
(3) Chernoff's inequality and bound
3.7.1 Chebyshev's Inequality

Let X be a random variable with mean μ and variance σ^2. Then, for any k > 0,

    P{|X − μ| ≥ kσ} ≤ 1/k^2

and

    P{|X − μ| < kσ} ≥ 1 − 1/k^2

Proof

Consider a random variable X with mean μ and variance σ^2. Then

    σ^2 = E[(X − μ)^2] = ∫ (x − μ)^2 f_X(x) dx

From the density property, f_X(x) ≥ 0, so dropping the region μ − kσ < x < μ + kσ from the integral can only decrease it:

    σ^2 ≥ ∫_{−∞}^{μ−kσ} (x − μ)^2 f_X(x) dx + ∫_{μ+kσ}^{∞} (x − μ)^2 f_X(x) dx   ... (1)

Consider the first region, x ≤ μ − kσ:

    x − μ ≤ −kσ  ⇒  (x − μ)^2 ≥ k^2 σ^2   ... (A)

Consider the second region, x ≥ μ + kσ:

    x − μ ≥ kσ  ⇒  (x − μ)^2 ≥ k^2 σ^2   ... (B)

Substituting (A) and (B) in equation (1),

    σ^2 ≥ k^2 σ^2 [ ∫_{−∞}^{μ−kσ} f_X(x) dx + ∫_{μ+kσ}^{∞} f_X(x) dx ]

    σ^2 ≥ k^2 σ^2 [ P{X ≤ μ − kσ} + P{X ≥ μ + kσ} ]

    σ^2 ≥ k^2 σ^2 P{|X − μ| ≥ kσ}

Therefore,

    P{|X − μ| ≥ kσ} ≤ 1/k^2

Since P{|X − μ| < kσ} = 1 − P{|X − μ| ≥ kσ},

    P{|X − μ| < kσ} ≥ 1 − 1/k^2

Hence proved.
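The bound can be checked exactly for a small discrete distribution. In this sketch (the fair die is an assumed example, not from the text) the tail probability P{|X − μ| ≥ kσ} is computed exactly and compared against 1/k²:

```python
from fractions import Fraction

# fair die (assumed example): mu = 7/2, sigma^2 = 35/12
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())
var = sum((x - mu) ** 2 * p for x, p in pmf.items())
sigma = float(var) ** 0.5

def tail_prob(k):
    """exact P{|X - mu| >= k*sigma} for the die"""
    return float(sum(p for x, p in pmf.items() if abs(x - mu) >= k * sigma))

for k in (1.0, 1.5, 2.0):
    assert tail_prob(k) <= 1.0 / k ** 2  # Chebyshev bound holds
```

For k = 1 the exact tail is 1/3 (the outcomes 1 and 6), comfortably below the bound of 1.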
3.7.2 Markov's Inequality

Consider a continuous random variable X with pdf f_X(x). If f_X(x) = 0 for x < 0, then for any a > 0,

    P(X ≥ a) ≤ E[X]/a

Proof

We know that

    P(X ≥ a) = ∫_a^∞ f_X(x) dx   (3.60)

Also, since f_X(x) = 0 for x < 0,

    X̄ = E[X] = ∫_0^∞ x f_X(x) dx   (3.61)

Dropping the region 0 ≤ x < a and using x ≥ a on the remaining region,

    E[X] ≥ ∫_a^∞ x f_X(x) dx ≥ a ∫_a^∞ f_X(x) dx

From (3.60),

    E[X] ≥ a P(X ≥ a)   ⇒   P(X ≥ a) ≤ E[X]/a   (3.62)

Note

If a = X̄, then P(X ≥ X̄) ≤ 1.
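A quick closed-form check, assuming X ~ Exp(1) so that f_X(x) = e^{−x} for x ≥ 0 and E[X] = 1 (an illustrative example, not from the text): the exact tail P(X ≥ a) = e^{−a} always sits below the Markov bound E[X]/a:

```python
import math

# X ~ Exp(1) (assumed example): E[X] = 1
EX = 1.0

def tail(a):
    """exact P(X >= a) for Exp(1)"""
    return math.exp(-a)

for a in (0.5, 1.0, 2.0, 5.0):
    assert tail(a) <= EX / a  # Markov: P(X >= a) <= E[X]/a
```

Note how loose the bound is for large a: the exact tail decays exponentially while the bound decays only like 1/a.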
3.8 Transformations of a Random Variable

Transformation is used to convert a given random variable X into another random variable Y. It is denoted as

    Y = T(X)

where T represents the transformation. It may be linear, non-linear, staircase or segmented. The transformation of X to Y is shown in Fig. 3.3.

Fig. 3.3 Transformation of X to Y

Here we need to consider three cases:

(1) Both X and T are continuous and T is either monotonically increasing or decreasing with X.
(2) Both X and T are continuous and T is non-monotonic.
(3) X is discrete and T is continuous.
3.8.1 Monotonic Transformation of a Continuous Random Variable

Monotonically increasing function:

Assume that the transformation T is continuous and differentiable for all values of x for which f_X(x) ≠ 0. Let another random variable Y have a value y_0 corresponding to the value x_0 of X, as shown in Fig. 3.4(a). The transformation is given as

    Y = T(X)   (3.67)

    y_0 = T(x_0)   or   x_0 = T^{−1}(y_0)   (3.68)

where T^{−1} represents the inverse of the transformation T.

Since the transformation provides a one-to-one correspondence between X and Y, the probability of the event {Y ≤ y_0} must be equal to the probability of the event {X ≤ x_0}. Thus

    P{Y ≤ y_0} = P{X ≤ x_0}

    F_Y(y_0) = F_X(x_0)   (3.69)

or

    ∫_{−∞}^{y_0} f_Y(y) dy = ∫_{−∞}^{T^{−1}(y_0)} f_X(x) dx   (3.70)

Using Leibniz's rule and differentiating both sides with respect to y_0,

    f_Y(y_0) = f_X(T^{−1}(y_0)) dT^{−1}(y_0)/dy_0

For any y,

    f_Y(y) = f_X(T^{−1}(y)) dT^{−1}(y)/dy

or

    f_Y(y) = f_X(x) dx/dy   (3.71)
Monotonically decreasing function:

Similarly, for a monotonically decreasing function, as shown in Fig. 3.4(b), the event {Y ≤ y_0} corresponds to the event {X ≥ x_0}, so

    F_Y(y_0) = 1 − F_X(x_0)

or

    ∫_{−∞}^{y_0} f_Y(y) dy = 1 − ∫_{−∞}^{T^{−1}(y_0)} f_X(x) dx

Differentiating with respect to y_0,

    f_Y(y) = −f_X(T^{−1}(y)) dT^{−1}(y)/dy   (3.74)

Since dT^{−1}(y)/dy is negative for a decreasing transformation, for a monotonic transformation, either increasing or decreasing, the density function of Y is

    f_Y(y) = f_X(x) |dx/dy|   (3.75)
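Equation (3.75) can be sanity-checked numerically. Assuming X ~ Exp(1) and the increasing transformation Y = 2X + 1 (an illustrative choice, not from the text), the density obtained from f_Y(y) = f_X(T⁻¹(y)) |dx/dy| should match the numerical derivative of F_Y(y) = F_X(T⁻¹(y)):

```python
import math

def f_X(x):
    """Exp(1) density (assumed example)"""
    return math.exp(-x) if x >= 0 else 0.0

def T_inv(y):
    """inverse of the monotonic transformation Y = 2X + 1"""
    return (y - 1.0) / 2.0

def f_Y(y):
    """transformed density f_Y(y) = f_X(T^{-1}(y)) |dx/dy|, with |dx/dy| = 1/2"""
    return f_X(T_inv(y)) * 0.5

def F_Y(y):
    """exact CDF: F_Y(y) = F_X(T^{-1}(y)) for an increasing T"""
    x = T_inv(y)
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

# check: numerical derivative of F_Y matches the transformed density
h = 1e-6
for y in (1.5, 2.0, 4.0):
    deriv = (F_Y(y + h) - F_Y(y - h)) / (2 * h)
    assert abs(deriv - f_Y(y)) < 1e-6
```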
3.8.2 Non-Monotonic Transformation of a Continuous Random Variable

Consider that a random variable Y is a non-monotonic transformation of a random variable X, as shown in Fig. 3.5.

Fig. 3.5 Non-monotonic function

For a given event {Y ≤ y_0}, there is more than one corresponding value of X. From Fig. 3.5, it is found that the event {Y ≤ y_0} corresponds to the events {X ≤ x_1 and x_2 < X ≤ x_3}. Thus the probability of the event {Y ≤ y_0} is equal to the probability of the event {X | Y ≤ y_0}:

    P{Y ≤ y_0} = P{X | Y ≤ y_0}   (3.76)

or

    F_Y(y_0) = ∫_{x | Y ≤ y_0} f_X(x) dx   (3.77)

By differentiating, the density function is given by

    f_Y(y) = dF_Y(y)/dy = d/dy [ ∫_{x | Y ≤ y} f_X(x) dx ]
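Carrying out this differentiation gives the standard result f_Y(y) = Σᵢ f_X(xᵢ)/|dT/dx| evaluated at each real root xᵢ of y = T(x). The sketch below checks it for the non-monotonic transformation Y = X² with X uniform on [−1, 1] (an assumed example), where the two roots are ±√y and the exact CDF is F_Y(y) = √y:

```python
import math

def f_X(x):
    """U[-1, 1] density (assumed example)"""
    return 0.5 if -1.0 <= x <= 1.0 else 0.0

def f_Y(y):
    """density of Y = X^2: sum f_X(x_i)/|T'(x_i)| over roots x_i = +/- sqrt(y)"""
    r = math.sqrt(y)
    return (f_X(r) + f_X(-r)) / (2.0 * r)  # |T'(x)| = |2x| = 2*sqrt(y)

def F_Y(y):
    """exact CDF: P(-sqrt(y) <= X <= sqrt(y)) = sqrt(y) for 0 < y <= 1"""
    return math.sqrt(min(max(y, 0.0), 1.0))

# check: numerical derivative of the exact CDF matches the sum-over-roots density
h = 1e-7
for y in (0.25, 0.49, 0.81):
    deriv = (F_Y(y + h) - F_Y(y - h)) / (2 * h)
    assert abs(deriv - f_Y(y)) < 1e-5
```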