Lecture 7: Markov's and Chebyshev's Inequalities

Markov Inequality
✓ Often, given a random variable X whose distribution is unknown but
whose expected value µ is known, we want to know: "What is the
probability that the value of the r.v. X is 'far' from its expectation?"
This gives us some idea of the spread of the distribution.
✓ A generic answer to this, which holds for any non-negative random
variable, is given by Markov’s inequality:
✓ If X is a random variable that takes only non-negative values, then for any
value a>0,
$$P(X \ge a) \le \frac{E(X)}{a}$$

Proof
We give a proof for the case where X is continuous with density f(x).

$$E(X) = \int_{-\infty}^{\infty} x f(x)\,dx = \int_{0}^{\infty} x f(x)\,dx \quad \text{(since } X \ge 0\text{)}$$

$$= \int_{0}^{a} x f(x)\,dx + \int_{a}^{\infty} x f(x)\,dx \ge \int_{a}^{\infty} x f(x)\,dx \ge \int_{a}^{\infty} a f(x)\,dx = a\,P(X \ge a)$$

Thus we conclude that:

$$P(X \ge a) \le \frac{E(X)}{a}$$

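As a quick illustration (not from the original notes), here is a minimal Python sketch that checks Markov's bound empirically by Monte Carlo. The exponential distribution with mean 2 is an assumed choice of non-negative random variable; any non-negative distribution would work.

```python
import numpy as np

# Empirical check of Markov's inequality: P(X >= a) <= E(X)/a.
# Assumed example distribution: exponential with mean 2 (non-negative).
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)

for a in [1, 2, 4, 8]:
    empirical = np.mean(x >= a)   # Monte Carlo estimate of P(X >= a)
    bound = x.mean() / a          # Markov bound E(X)/a
    print(f"a={a}: P(X >= a) ~ {empirical:.4f} <= bound {bound:.4f}")
```

Note that the bound is often very loose: for small a it can even exceed 1, in which case it tells us nothing.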

Example
A biased coin, which lands heads with probability 1/10 each time it is flipped, is
flipped 200 times consecutively. Give an upper bound on the probability that it
lands heads at least 120 times.
Solution
The number of heads is a binomially distributed r.v., X, with parameters p = 1/10
and n = 200.
Thus, the expected number of heads is E(X) = np = 200 · (1/10) = 20.
By Markov's inequality we have that:

$$P(X \ge 120) \le \frac{E(X)}{120} = \frac{20}{120} = \frac{1}{6} \approx 0.167$$
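For comparison, a short sketch (assuming SciPy is installed) that sets the Markov bound against the exact binomial tail; `binom.sf(a - 1, n, p)` returns P(X ≥ a).

```python
from scipy.stats import binom

n, p, a = 200, 0.1, 120
markov_bound = (n * p) / a        # E(X)/a = 20/120 = 1/6
exact = binom.sf(a - 1, n, p)     # exact P(X >= 120)

print(f"Markov bound: {markov_bound:.4f}")  # 0.1667
print(f"Exact tail:   {exact:.3e}")         # many orders of magnitude smaller
```

The true probability is astronomically smaller than 1/6, which shows how conservative a bound based only on the mean can be.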
Chebyshev’s inequality
✓ Chebyshev's inequality gives a bound on the probability that X is far from its
expected value.
✓ Suppose that X is a random variable (discrete or continuous) having mean μ
and variance σ², both of which are finite. Then for any positive number k,

$$P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}$$

where 1/k² is the upper bound of P(|X − μ| ≥ kσ).
Proof

Apply Markov's inequality to the non-negative random variable (X − μ)² with a = k²σ²:

$$P(|X - \mu| \ge k\sigma) = P\!\left((X-\mu)^2 \ge k^2\sigma^2\right) \le \frac{E\!\left[(X-\mu)^2\right]}{k^2\sigma^2} = \frac{\sigma^2}{k^2\sigma^2} = \frac{1}{k^2}$$
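A minimal simulation sketch of Chebyshev's bound, analogous to the Markov check above. The uniform(0, 1) distribution is an assumed example, chosen only because its mean and variance are easy to verify.

```python
import numpy as np

# Empirical check of Chebyshev's inequality: P(|X - mu| >= k*sigma) <= 1/k^2.
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, size=1_000_000)
mu, sigma = x.mean(), x.std()

for k in [1.5, 2.0, 3.0]:
    empirical = np.mean(np.abs(x - mu) >= k * sigma)
    print(f"k={k}: P(|X-mu| >= k*sigma) ~ {empirical:.4f} <= 1/k^2 = {1/k**2:.4f}")
```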
Example
A random variable X has a density function given by:

$$f(x) = \begin{cases} 2e^{-2x}, & x > 0 \\ 0, & \text{elsewhere} \end{cases}$$
Use Chebyshev's inequality to obtain an upper bound on P(|X − μ| ≥ 1).
Solution
First find the mean μ = E(X).
We will use the MGF method.

The moment generating function is

$$M_X(t) = E\!\left(e^{tX}\right) = \int_0^{\infty} e^{tx} \cdot 2e^{-2x}\,dx = \frac{2}{2-t}, \qquad t < 2$$

✓ To get the mean, we find the first derivative of the MGF:

$$M_X'(t) = \frac{2}{(2-t)^2}, \qquad \mu = E(X) = M_X'(0) = \frac{1}{2}$$
✓ We know that kσ = 1.

$$\sigma^2 = E(X^2) - [E(X)]^2 = E(X^2) - \frac{1}{4}$$
Now let's find E(X²) from the second derivative of the MGF:

$$M_X''(t) = \frac{4}{(2-t)^3}, \qquad E(X^2) = M_X''(0) = \frac{4}{8} = \frac{1}{2}$$

Hence σ² = 1/2 − 1/4 = 1/4, so σ = 1/2 and kσ = 1 gives k = 2. By Chebyshev's inequality,

$$P(|X - \mu| \ge 1) \le \frac{1}{k^2} = \frac{1}{4}$$
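The whole calculation can be checked symbolically; the sketch below (assuming SymPy is available) reproduces the MGF, the moments, the Chebyshev bound, and, for comparison, the exact probability.

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)
f = 2 * sp.exp(-2 * x)            # the example's density, valid for x > 0

# MGF: M(t) = E(e^{tX}) = integral_0^oo e^{tx} f(x) dx = 2/(2 - t), t < 2
M = sp.integrate(sp.exp(t * x) * f, (x, 0, sp.oo), conds='none')

mu  = sp.diff(M, t).subs(t, 0)    # E(X)   = M'(0)  = 1/2
EX2 = sp.diff(M, t, 2).subs(t, 0) # E(X^2) = M''(0) = 1/2
var = sp.simplify(EX2 - mu**2)    # sigma^2 = 1/4

# k*sigma = 1 means k = 1/sigma, so the bound 1/k^2 equals sigma^2 = 1/4
print("mu =", mu, " var =", var, " Chebyshev bound =", var)

# Exact value for comparison: P(|X - 1/2| >= 1) = P(X >= 3/2) = e^{-3}
exact = sp.integrate(f, (x, sp.Rational(3, 2), sp.oo))
print("exact P(|X - mu| >= 1) =", exact, "~", float(exact))  # ~ 0.0498
```

The exact probability (about 0.05) is well under the bound of 1/4, illustrating that Chebyshev's inequality is distribution-free and therefore often conservative.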
