
Lecture 18

The document is a lecture note on Probability and Statistics by Pradeep Boggarapu, covering key concepts such as Markov's and Chebyshev's inequalities, along with examples and problems. It explains how to apply these inequalities to find probabilities related to random variables, including Poisson and binomial distributions. Additionally, it discusses normal approximation to the binomial distribution and provides exercises for practice.


Probability and Statistics (MATH F113)

Pradeep Boggarapu

Department of Mathematics
BITS PILANI K K Birla Goa Campus, Goa

February 20, 2025

Pradeep Boggarapu (Dept. of Maths) Probability and Statistics February 20, 2025 1 / 20
1 Markov’s Inequality

2 Chebyshev’s Inequality

3 Some Problems

4 Normal Approximation to the Binomial Distribution

Statement of Markov’s Inequality

Markov’s Inequality
For a non-negative random variable X whose expectation E[X] exists, Markov’s inequality gives an upper bound on P[X ≥ ϵ] for every ϵ > 0. It states that

P[X ≥ ϵ] ≤ E[X]/ϵ.

Check for Discrete Case

Proof of Markov’s inequality in the discrete case:

E[X] = Σ_x x p(x) = Σ_{x: x < ϵ} x p(x) + Σ_{x: x ≥ ϵ} x p(x)
     ≥ 0 + ϵ Σ_{x: x ≥ ϵ} p(x) = ϵ P[X ≥ ϵ].

The first sum is non-negative because X ≥ 0, and in the second sum each x is at least ϵ.

Check for Continuous Case

Proof of Markov’s inequality in the continuous case:

E[X] = ∫_{−∞}^{∞} x f(x) dx = ∫_{−∞}^{ϵ} x f(x) dx + ∫_{ϵ}^{∞} x f(x) dx
     ≥ 0 + ϵ ∫_{ϵ}^{∞} f(x) dx = ϵ P[X ≥ ϵ].

The first integral is non-negative since f(x) = 0 for x < 0 (X is non-negative), and on [ϵ, ∞) we have x ≥ ϵ.

Example

Example 1
If X ∼ Exp(1), show that P[X ≥ 2] ≤ 1/2.

Proof.
If X ∼ Exp(1), then E[X] = 1. Take ϵ = 2 and apply Markov’s inequality to X; note that X is a non-negative random variable. Thus,

P[X ≥ 2] ≤ E[X]/2 = 1/2.
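As a quick sanity check, the bound in Example 1 can be verified numerically. The following Python sketch (the language choice is mine; the lecture prescribes none) compares the exact tail P[X ≥ 2] = e^(−2) of an Exp(1) variable with the Markov bound E[X]/2:

```python
import math

def markov_bound(mean, eps):
    """Upper bound on P[X >= eps] for a non-negative r.v. with the given mean."""
    return mean / eps

# For X ~ Exp(1): E[X] = 1, and the exact tail is P[X >= 2] = e^(-2) ~ 0.135.
exact_tail = math.exp(-2)
bound = markov_bound(mean=1.0, eps=2.0)  # 1/2, as in Example 1

# The bound holds, though it is far from tight here.
assert exact_tail <= bound
```

The gap between 0.135 and 0.5 illustrates that Markov's inequality is a crude but universal bound.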

Statement of Chebyshev’s Inequality
Chebyshev’s Inequality
For a random variable X for which E[X] and Var(X) exist, we have

P[|X − E[X]| ≥ ϵ] ≤ Var(X)/ϵ²

or, equivalently,

P[|X − E[X]| < ϵ] ≥ 1 − Var(X)/ϵ².

This is popularly known as Chebyshev’s Inequality.

Advantages
- Does not require the random variable to be non-negative.
- Most often gives better bounds than Markov’s inequality.
Statement of Chebyshev’s Inequality

Chebyshev’s Inequality
Let X be a random variable with mean µ and standard deviation σ. Then for any positive real number k,

P[|X − µ| ≥ kσ] ≤ 1/k²,

or

P[|X − µ| < kσ] ≥ 1 − 1/k².

Example

Example 2
Let X be a random variable with mean 11 and variance 9. Use Chebyshev’s inequality to find a lower bound for P[6 < X < 16].

P[6 < X < 16] = P[−5 < X − 11 < 5] = P[|X − E[X]| < 5]
             ≥ 1 − 9/25 = 16/25,

using Chebyshev’s inequality with ϵ = 5.
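For intuition on how conservative the bound can be, here is a short Python sketch. Chebyshev guarantees at least 16/25 = 0.64; if X happened to be Normal(11, 3) (an assumption on my part, since the example fixes only the mean and variance), the true probability is considerably larger:

```python
import math

def chebyshev_lower_bound(var, eps):
    """Lower bound on P[|X - E[X]| < eps] from Chebyshev's inequality."""
    return 1.0 - var / eps**2

bound = chebyshev_lower_bound(var=9.0, eps=5.0)   # 1 - 9/25 = 0.64

def normal_cdf(x, mu, sigma):
    # Standard-library normal CDF via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hypothetical distributional assumption: X ~ Normal(11, 3).
exact_if_normal = normal_cdf(16, 11, 3) - normal_cdf(6, 11, 3)  # ~0.904
assert exact_if_normal >= bound
```

Chebyshev's strength is that 0.64 holds for every distribution with this mean and variance, not just the normal.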

Problem 1

Problem 1
Let X be a Poisson(λ) random variable. Show that

P(0 < X < 2λ) ≥ (λ − 1)/λ.

Proof.
P(0 < X < 2λ) = P(−λ < X − λ < λ) = P(|X − λ| < λ)
             ≥ 1 − Var(X)/λ² = 1 − λ/λ² = (λ − 1)/λ.
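The bound can be checked against the exact Poisson probabilities for a concrete λ. In this Python sketch, λ = 10 is an arbitrary illustrative choice:

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 10  # arbitrary choice for illustration; any lambda > 1 works
# Exact P(0 < X < 2*lambda) = sum of the pmf over k = 1, ..., 2*lambda - 1.
exact = sum(poisson_pmf(k, lam) for k in range(1, 2 * lam))
bound = (lam - 1) / lam   # (lambda - 1)/lambda = 0.9 for lambda = 10

# The exact probability comfortably exceeds the Chebyshev bound.
assert exact >= bound
```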

Problem 2

Problem 2
A biased coin is flipped 100 consecutive times. Let X denote the proportion of heads. Find a lower bound for

P[p − 0.1 < X < p + 0.1],

where p denotes the unknown probability of heads.

Solution.
Here 100 · X ∼ Bin(100, p), thus E(X) = p and V(X) = p(1 − p)/100 ≤ 1/400, since p(1 − p) ≤ 1/4. Chebyshev’s theorem gives

P[p − 0.1 < X < p + 0.1] = P[|X − p| < 0.1] ≥ 1 − V(X)/0.1²
                         ≥ 1 − 1/(400 × 0.01) = 1 − 1/4 = 3/4.
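A simulation makes the guarantee concrete. In this Python sketch, p = 0.3 is an arbitrary choice (the problem leaves p unknown); the empirical frequency should sit well above the Chebyshev floor of 3/4:

```python
import random

random.seed(0)
n_flips, p, trials = 100, 0.3, 10_000   # p = 0.3 chosen only for illustration
inside = 0
for _ in range(trials):
    heads = sum(random.random() < p for _ in range(n_flips))
    x = heads / n_flips                  # proportion of heads
    if p - 0.1 < x < p + 0.1:
        inside += 1

estimate = inside / trials
# Chebyshev guarantees at least 3/4 for any p; the simulation agrees.
assert estimate >= 0.75
```

In fact the true probability here is close to 0.96; Chebyshev's 3/4 is deliberately distribution-free and hence loose.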
Problem 3

Problem 3
How many times would you flip a biased coin with probability of heads p in order to achieve

P[p − 0.01 < X < p + 0.01] ≥ 0.9,

where X denotes the proportion of heads in the executed trials?

Solution of Problem 3

Sol. of Problem 3
Let the number of trials be n. Here n · X ∼ Bin(n, p), thus E[X] = p and V[X] = p(1 − p)/n ≤ 1/(4n). Chebyshev’s theorem gives

P[p − 0.01 < X < p + 0.01] = P[|X − p| < 0.01] ≥ 1 − V(X)/0.01²
                           ≥ 1 − 1/(4n × 0.0001) = 1 − 2500/n.

Setting 1 − 2500/n = 0.9 gives n = 25000. Thus, we need at least 25000 trials to guarantee the given condition.
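The algebra above can be packaged as a small helper. This Python sketch (the function name is mine, not from the lecture) recovers n = 25000 for ϵ = 0.01 and confidence 0.9:

```python
def chebyshev_trials(eps, confidence):
    """Trials n such that 1 - 1/(4*n*eps**2) >= confidence (worst case p = 1/2)."""
    return 1.0 / (4.0 * eps**2 * (1.0 - confidence))

# Problem 3: eps = 0.01, confidence = 0.9 -> n = 25000 trials.
n = chebyshev_trials(eps=0.01, confidence=0.9)
```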

Exercises

Exercise 1.
A company produces X bulbs per day. If E [X ] = 1000
and Var (X ) = 100, then what can you say about
P[900 < X < 1100]?
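One way to check an answer to Exercise 1, sketched in Python using the Chebyshev bound from earlier slides (the helper name is mine): the event is P[|X − 1000| < 100], so with ϵ = 100 the bound is 1 − 100/100² = 0.99.

```python
def chebyshev_lower_bound(var, eps):
    """Lower bound on P[|X - E[X]| < eps] from Chebyshev's inequality."""
    return 1.0 - var / eps**2

# P[900 < X < 1100] = P[|X - 1000| < 100], so take eps = 100:
bound = chebyshev_lower_bound(var=100.0, eps=100.0)   # 0.99
```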

Normal approximation to binomial distribution

Normal approximation to binomial distribution.


Let X be binomial with parameters n and p. For large
n, X is approximately normal with mean np and
variance np(1 − p).
For most practical purposes the approximation is
acceptable for values of n and p such that either
p ≤ 0.5 and np > 5, or p > 0.5 and n(1 − p) > 5.

Figure: Normal approximation to binomial.

Normal approximation to binomial distribution

Let X be a binomial random variable with parameters n and p to which the normal approximation is acceptable. Suppose that X0 is the normal random variable with mean np and variance np(1 − p). We use the following approximations, for any non-negative integers ℓ and k:

P(X = k) ≈ P(k − 0.5 ≤ X0 ≤ k + 0.5)
P(X ≤ k) ≈ P(X0 ≤ k + 0.5)
P(X ≥ ℓ) ≈ P(X0 ≥ ℓ − 0.5)
P(ℓ ≤ X ≤ k) ≈ P(ℓ − 0.5 ≤ X0 ≤ k + 0.5)
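These continuity-corrected rules can be implemented directly. This Python sketch uses only the standard library (math.erf for the normal CDF) and checks the interval rule against the exact binomial sum; n = 100 and p = 0.4 are illustrative choices that satisfy the rule of thumb above:

```python
import math

def normal_cdf(x, mu, sigma):
    # Normal CDF via the error function (standard library only).
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def binom_normal_approx(n, p, lo, hi):
    """P(lo <= X <= hi) for X ~ Bin(n, p), with continuity correction."""
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))
    return normal_cdf(hi + 0.5, mu, sigma) - normal_cdf(lo - 0.5, mu, sigma)

def binom_exact(n, p, lo, hi):
    # Exact probability by summing the binomial pmf.
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(lo, hi + 1))

approx = binom_normal_approx(100, 0.4, 35, 45)   # ~0.738
exact = binom_exact(100, 0.4, 35, 45)
assert abs(approx - exact) < 0.01
```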

Normal approximation to binomial distribution

Problem 1
If a random variable has the binomial distribution with
n = 25 and p = 0.65, use the normal approximation to
determine the probabilities that it will take on
(a) the value 15;
(b) a value less than 10.
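A possible worked computation for Problem 1, sketched in Python with a standard-library normal CDF (the numeric values in the comments are approximate):

```python
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

n, p = 25, 0.65
mu, sigma = n * p, math.sqrt(n * p * (1 - p))   # 16.25 and about 2.385

# (a) P(X = 15) ~ P(14.5 <= X0 <= 15.5), roughly 0.145
part_a = normal_cdf(15.5, mu, sigma) - normal_cdf(14.5, mu, sigma)
# (b) P(X < 10) = P(X <= 9) ~ P(X0 <= 9.5), well under 1%
part_b = normal_cdf(9.5, mu, sigma)
```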

Normal approximation to binomial distribution

Problem 2
From past experience, a company knows that, on average,
5% of their concrete does not meet standards. Use the
normal approximation of the binomial distribution to
determine the probability that among 2000 bags of
concrete, 75 bags contain concrete that does not meet
standards.
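Similarly, Problem 2 can be worked numerically. In this Python sketch, µ = np = 100 and σ = √95 ≈ 9.75; the continuity-corrected probability comes out to roughly 0.0015:

```python
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

n, p = 2000, 0.05
mu, sigma = n * p, math.sqrt(n * p * (1 - p))   # 100 and about 9.747

# P(X = 75) ~ P(74.5 <= X0 <= 75.5)
prob = normal_cdf(75.5, mu, sigma) - normal_cdf(74.5, mu, sigma)
```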

Thank you for your attention

