STA 211 Lecture 2

The document discusses the expectation and variance of continuous random variables, defining key concepts such as probability density functions and symmetric probability functions. It provides examples and solutions for calculating expected values, variances, and distribution functions, along with an introduction to limit theorems, particularly Chebyshev's theorem. The document concludes with assignments and references for further reading on probability and distribution theories.


Statistics

A Virtual Lecture Facilitated by

J. N. Onyeka-Ubaka (Ph.D)
[email protected] +2348059839937
Expectation and Variance of Continuous
Random Variables
Let X be a continuous random variable with probability
density function f(x). Suppose we divide the range of X into
small intervals, each of length ∆x. The probability that X
falls in the interval (x, x + ∆x) is approximately f(x)∆x.

▪ The expected value of X is therefore approximately


∑xf(x) ∆x.

▪ The limiting value of this as ∆x → 0 is ∫ x f(x) dx.


▪ This leads us to the following definition.
Definition: If X is a continuous random variable with
probability density function f(x), then

E(X) = ∫_{−∞}^{∞} x f(x) dx.

▪ In general, E(X^r) = ∫_{−∞}^{∞} x^r f(x) dx
and

E(φ(X)) = ∫_{−∞}^{∞} φ(x) f(x) dx.

▪ The variance σ² of X is defined by

σ² = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² f(x) dx = E(X²) − μ²

where μ = E(X).
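As an illustrative check of these definitions, the short Python sketch below (assuming NumPy and SciPy are available; the standard normal density is an arbitrary test case) approximates E(X), E(X²) and Var(X) by numerical integration:

import numpy as np
from scipy.integrate import quad

def f(x):
    # Illustrative density: standard normal, f(x) = (1/√(2π)) e^{−x²/2}
    return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

mean, _   = quad(lambda x: x * f(x), -np.inf, np.inf)     # E(X) = ∫ x f(x) dx
second, _ = quad(lambda x: x**2 * f(x), -np.inf, np.inf)  # E(X²) = ∫ x² f(x) dx
print(mean, second, second - mean**2)                     # approx. 0, 1, 1 = Var(X)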
Example
Let X be a continuous random variable with probability density function

f(x) = 1/6,   0 < x < 6
       0,     elsewhere

Find (a) F(x)
     (b) E(X)
     (c) E(X²)
     (d) Var(X).
Solution
(a) F(x) = ∫_0^x (1/6) dy = [y/6]_0^x = x/6 = P(X ≤ x).

(b) E(X) = ∫_0^6 x (1/6) dx = (1/6)(1/2) x² |_0^6 = (1/6)(1/2)(36) = 3

(c) E(X²) = ∫_0^6 x² (1/6) dx = (1/6)(1/3) x³ |_0^6 = (1/6)(1/3)(6³) = 12

(d) Var(X) = E(X²) − μ²
           = 12 − 9
           = 3
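The same numerical approach can confirm this example; the sketch below (again assuming SciPy) integrates the density f(x) = 1/6 over (0, 6):

from scipy.integrate import quad

f = lambda x: 1.0 / 6.0                     # the density on (0, 6); 0 elsewhere

EX, _  = quad(lambda x: x * f(x), 0, 6)     # E(X)   -> 3.0
EX2, _ = quad(lambda x: x**2 * f(x), 0, 6)  # E(X²)  -> 12.0
print(EX, EX2, EX2 - EX**2)                 # 3.0 12.0 3.0, matching (b)-(d)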
Symmetric Probability Density Function
Definition 1: A probability density function f(x) is called a symmetric
probability function if f(-x) = f(x) for all x.
Example
✓ f(x) = (1/√(2π)) e^{−x²/2}, −∞ < x < ∞, is a symmetric function.
✓ f(x) = 1/(2a), −a < x < a, is a symmetric probability density function.

Definition 2: A probability density function f(x) is said to be
symmetrical about a if f(a + x) = f(a − x) for all x.
If a = 0, then f(x) is symmetrical about 0.
Example
Prove that f(x) is symmetrical about the population mean, μ.
Proof
If f(x) = (1/√(2π)) e^{−(x−μ)²/2}, −∞ < x < ∞,
then

f(μ + x) = (1/√(2π)) e^{−(μ + x − μ)²/2} = (1/√(2π)) e^{−x²/2}

f(μ − x) = (1/√(2π)) e^{−(μ − x − μ)²/2} = (1/√(2π)) e^{−x²/2}

Hence f(μ + x) = f(μ − x). Thus f(x) is symmetrical about μ.
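A quick Python check of this symmetry (μ = 2.5 is an arbitrary illustrative choice):

import math

mu = 2.5                                       # arbitrary illustrative mean
f = lambda x: math.exp(-0.5 * (x - mu)**2) / math.sqrt(2 * math.pi)

for x in (0.1, 1.0, 3.7):
    assert math.isclose(f(mu + x), f(mu - x))  # f(μ + x) == f(μ − x)
print("f is symmetric about mu at the sampled points")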
Distribution Function

 The distribution function is also called the cumulative distribution
function and is denoted by F(x).
Theorem: If f(x) is a symmetric probability density function, then
F(x) = 1 - F(-x).
Proof
F(−x) = ∫_{−∞}^{−x} f(y) dy = ∫_{−∞}^{−x} f(−y) dy
since f(y) = f(−y).
Let z = −y; then dz = −dy and

F(−x) = ∫_{∞}^{x} f(z)(−1) dz = ∫_{x}^{∞} f(z) dz = ∫_{−∞}^{∞} f(z) dz − ∫_{−∞}^{x} f(z) dz = 1 − F(x)

and hence
F(x) = 1 − F(−x).
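The theorem can be illustrated numerically for the standard normal distribution, whose density is symmetric; a brief sketch assuming SciPy's norm.cdf:

from scipy.stats import norm

for x in (0.3, 0.5, 1.96):
    print(norm.cdf(x), 1 - norm.cdf(-x))  # F(x) and 1 − F(−x) agree at each x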
Assignment
1. If X has a normal distribution with mean 0 and variance 1, find
(i) 𝑃(𝑋 ≤ 0.5) (ii) 𝑃(𝑋 ≥ 0.5) (iii) 𝑃(𝑋 ≤ −0.3)
(iv) 𝑃(0.3 ≤ 𝑋 ≤ 0.5) (v) 𝑃(−0.3 ≤ 𝑋 ≤ 0.5)
Sketch the areas under the curve in each case.

2. Suppose X is a continuous random variable with pdf
f(x) = λe^{−λx}, 0 < x < ∞. Find
(a) (i) the mean of X (ii) the variance of X.
(b) the distribution function F(x).
(c) the probability that X lies between 2 and 5.
NOTE: Solve and Submit when asked to do so.
Limit Theorems
 The most important theoretical results in probability are limit
theorems.
 Here we present Chebyshev's inequality, proposed by the Russian
mathematician Chebyshev, which plays an important role in probability
theory. This inequality is then used to deduce the law of large
numbers for independent and identically distributed random variables.

Chebyshev’s Theorem: If μ and σ are the mean and standard deviation of a
random variable X, then for any positive constant k, the probability is at
least 1 − 1/k² that X will take on a value within k standard deviations of
the mean. That is,

P(|X − μ| < kσ) ≥ 1 − 1/k²,   σ ≠ 0.
Proof

[Figure: the density curve of X, with the points μ − kσ, μ and μ + kσ marked on the horizontal axis.]

σ² = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² f(x) dx

Then, dividing the integral into three parts as shown in the figure
above, we get

σ² = ∫_{−∞}^{μ−kσ} (x − μ)² f(x) dx + ∫_{μ−kσ}^{μ+kσ} (x − μ)² f(x) dx + ∫_{μ+kσ}^{∞} (x − μ)² f(x) dx

Since the integrand (x − μ)² f(x) is non-negative, we can form the
inequality

σ² ≥ ∫_{−∞}^{μ−kσ} (x − μ)² f(x) dx + ∫_{μ+kσ}^{∞} (x − μ)² f(x) dx

by deleting the second integral.
Proof of Chebyshev’s Inequality Cont’d
Therefore, since (x − μ)² ≥ k²σ² for x ≤ μ − kσ or x ≥ μ + kσ, it
follows that

σ² ≥ ∫_{−∞}^{μ−kσ} k²σ² f(x) dx + ∫_{μ+kσ}^{∞} k²σ² f(x) dx

and hence that

1/k² ≥ ∫_{−∞}^{μ−kσ} f(x) dx + ∫_{μ+kσ}^{∞} f(x) dx,   provided σ² ≠ 0.

Since the sum of the two integrals on the right-hand side is the
probability that X will take on a value less than or equal to μ − kσ or
greater than or equal to μ + kσ, we have thus shown that

P(|X − μ| ≥ kσ) ≤ 1/k²

and it follows that

P(|X − μ| < kσ) ≥ 1 − 1/k².
Example
The probability is at least 1 − 1/2² = 3/4 that a random variable X will
take on a value within two standard deviations of the mean.
The probability is at least 1 − 1/5² = 24/25 that it will take on a value
within five standard deviations of the mean.
Remark: It is in this sense that σ controls the spread or dispersion of
the distribution of a random variable. Clearly, the probability given by
Chebyshev’s theorem is only a lower bound.
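A Monte Carlo sketch can illustrate that the bound holds, and usually loosely; the exponential distribution, seed, and sample size below are arbitrary illustrative choices (Python with NumPy assumed):

import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)    # mean 1, standard deviation 1

mu, sigma = x.mean(), x.std()
for k in (2, 5):
    frac = np.mean(np.abs(x - mu) < k * sigma)  # empirical P(|X − μ| < kσ)
    print(k, frac, ">=", 1 - 1 / k**2)          # ~0.95 vs 0.75, ~0.998 vs 0.96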
Remarks

 For details, read Probability Results and Limit Theorems, pages 218–222.
Questions and Answers
References

 Onyeka-Ubaka, J. N. (2022). Probability and Distribution Theories for
Professional Competence. First Edition, Masterprint Educational
Publishers & Printers, Ibadan.

 Multi-Level Statistical Table


Thank you!
