Lecture 7 notes
Markov's Inequality
✓ Often, given a random variable X whose distribution is unknown but whose expected value µ is known, we want to know: "What is the probability that the value of the r.v. X is 'far' from its expectation?"
✓ This would give us some idea of the spread of the distribution.
✓ A generic answer to this, which holds for any non-negative random
variable, is given by Markov’s inequality:
✓ If X is a random variable that takes only non-negative values, then for any
value a>0,
P(X ≥ a) ≤ E(X)/a

Proof
We give a proof for the case where X is continuous with density f(x).
E(X) = ∫_{−∞}^{∞} x f(x) dx
     = ∫_{0}^{∞} x f(x) dx,   since X takes only non-negative values (f(x) = 0 for x < 0)
Continuation…
= ∫_{0}^{a} x f(x) dx + ∫_{a}^{∞} x f(x) dx
≥ ∫_{a}^{∞} x f(x) dx          (the first integral is non-negative)
≥ ∫_{a}^{∞} a f(x) dx          (since x ≥ a over the range of integration)
= a ∫_{a}^{∞} f(x) dx
= a P(X ≥ a)

Dividing both sides by a gives P(X ≥ a) ≤ E(X)/a, which is Markov's inequality.
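As a quick illustrative check, the Python sketch below (assuming NumPy is available; the exponential distribution with mean 0.5 is an arbitrary choice of non-negative random variable) compares the empirical tail probability P(X ≥ a) with the Markov bound E(X)/a.

```python
import numpy as np

# Illustrative simulation check of Markov's inequality.
# X ~ Exponential with mean 0.5 -- any non-negative random variable would do.
rng = np.random.default_rng(0)
x = rng.exponential(scale=0.5, size=1_000_000)

for a in [0.5, 1.0, 2.0]:
    tail = np.mean(x >= a)    # empirical P(X >= a)
    bound = x.mean() / a      # Markov bound E(X)/a
    print(f"a = {a}: P(X >= a) ~ {tail:.4f} <= E(X)/a = {bound:.4f}")
```

The empirical tail probability stays below the bound for every a, as the inequality guarantees.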
Chebyshev's Inequality
✓ If X is a random variable with mean μ and standard deviation σ, then for any k > 0,

P(|X − μ| ≥ kσ) ≤ 1/k²

where 1/k² is the upper bound of P(|X − μ| ≥ kσ).
Proof
Apply Markov's inequality to the non-negative random variable (X − μ)², with a = k²σ²:

P(|X − μ| ≥ kσ) = P((X − μ)² ≥ k²σ²) ≤ E[(X − μ)²] / (k²σ²) = σ² / (k²σ²) = 1/k²
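As an illustration of the bound (a sketch assuming NumPy; the Uniform(0, 1) distribution is chosen only to show the inequality holds regardless of the distribution), the Python snippet below checks that the empirical probability P(|X − μ| ≥ kσ) never exceeds 1/k².

```python
import numpy as np

# Illustrative simulation check of Chebyshev's inequality.
# X ~ Uniform(0, 1): mu = 0.5, sigma = 1/sqrt(12) ~ 0.2887.
rng = np.random.default_rng(0)
x = rng.uniform(size=1_000_000)
mu, sigma = x.mean(), x.std()

for k in [1.5, 2.0, 3.0]:
    prob = np.mean(np.abs(x - mu) >= k * sigma)   # empirical P(|X - mu| >= k*sigma)
    print(f"k = {k}: P(|X - mu| >= k*sigma) ~ {prob:.4f} <= 1/k^2 = {1/k**2:.4f}")
```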
Example
A random variable X has a density function given by:

f(x) = { 2e^{−2x},  x > 0
       { 0,         elsewhere
Use Chebyshev's inequality to obtain an upper bound on P(|X − μ| ≥ 1).
Solution
First find the mean, μ = E(X). We will use the MGF method, as sketched below.
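A standard reconstruction of the MGF step for this density is:

M_X(t) = E(e^{tX}) = ∫_{0}^{∞} e^{tx} · 2e^{−2x} dx = 2/(2 − t),   for t < 2

μ = E(X) = M′_X(0) = [2/(2 − t)²]_{t=0} = 1/2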
✓ We know that kσ = 1, so k = 1/σ and the Chebyshev upper bound is 1/k² = σ². We therefore need the variance:
σ² = E(X²) − [E(X)]²
   = E(X²) − (1/2)²
   = E(X²) − 1/4
Now let's find E(X²).
Continuation…
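As a numerical cross-check of where the example is heading (a sketch assuming NumPy and SciPy are available; the exact value e^{−3} for P(|X − μ| ≥ 1) follows from the given density), the Python snippet below integrates the density to obtain E(X²), σ², k = 1/σ, and the Chebyshev bound 1/k².

```python
import numpy as np
from scipy import integrate

# Numerical cross-check of the example: f(x) = 2e^{-2x} for x > 0.
def f(x):
    return 2 * np.exp(-2 * x)

mu = integrate.quad(lambda x: x * f(x), 0, np.inf)[0]       # E(X)   = 1/2
ex2 = integrate.quad(lambda x: x**2 * f(x), 0, np.inf)[0]   # E(X^2) = 1/2
var = ex2 - mu**2                                           # sigma^2 = 1/4
sigma = var ** 0.5

k = 1 / sigma                 # k*sigma = 1  =>  k = 2
bound = 1 / k**2              # Chebyshev bound 1/k^2 = 1/4

exact = np.exp(-3)            # exact P(|X - mu| >= 1) = P(X >= 3/2) = e^{-3}

print(f"mu = {mu:.4f}, E(X^2) = {ex2:.4f}, sigma^2 = {var:.4f}")
print(f"Chebyshev bound = {bound:.4f}, exact probability = {exact:.4f}")
```

The Chebyshev bound of 1/4 is valid but conservative: the exact probability is about 0.0498.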