Mathematical Expectation

1) The moment generating function (MGF) of a random variable X, denoted M_X(t), is the expected value of e^(tX) and provides information about the moments of X. 2) Key properties of the MGF include: M_X(0) = 1, the kth derivative of M_X(t) evaluated at 0 equals the kth moment of X, and the MGF can be expressed as a Taylor series in the moments of X. 3) The moments of a random variable, such as the mean, variance, and higher-order central moments, contain useful information about the distribution and can be determined from the derivatives of the MGF.


Mathematical Expectation

Mean of a Random Variable:

Definition:
Let X be a random variable with a probability distribution f(x). The mean (or expected value) of X is denoted by μ_X (or E(X)) and is defined by:

E(X) = μ_X = Σ_x x·f(x)                 if X is discrete
E(X) = μ_X = ∫_{−∞}^{∞} x·f(x) dx       if X is continuous
Example:
A shipment of 8 similar microcomputers to a retail outlet contains 3 that are defective and 5 that are non-defective. If a school makes a random purchase of 2 of these computers, find the expected number of defective computers purchased.
Solution:
Let X = the number of defective computers purchased. X is hypergeometric, so the probability distribution of X is:

f(x) = C(3, x)·C(5, 2−x) / C(8, 2),  x = 0, 1, 2

x       0       1       2
f(x)    10/28   15/28   3/28

The expected value of the number of defective computers purchased is the mean (or the expected value) of X, which is:

E(X) = μ_X = Σ_{x=0}^{2} x·f(x)
     = 0·f(0) + 1·f(1) + 2·f(2)
     = 0(10/28) + 1(15/28) + 2(3/28) = 21/28 = 3/4
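The computation above can be checked with a short script (a sketch, not part of the original slides) that builds the hypergeometric pmf with exact fractions:

```python
from fractions import Fraction
from math import comb

# pmf for the defective-computers example:
# 3 defective among 8 machines, sample of 2 drawn without replacement.
def f(x):
    return Fraction(comb(3, x) * comb(5, 2 - x), comb(8, 2))

# E(X) = sum over x of x * f(x)
expected = sum(x * f(x) for x in range(3))
print(expected)  # 3/4
```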

Example
Let X be a continuous random variable that
represents the life (in hours) of a certain electronic
device. The pdf of X is given by:

Find the expected life of this type of device.

Solution:

hours
Theorem
Let X be a random variable with a probability distribution f(x), and let g(X) be a function of the random variable X. The mean (or expected value) of the random variable g(X) is denoted by μ_{g(X)} (or E[g(X)]) and is defined by:

E[g(X)] = μ_{g(X)} = Σ_x g(x)·f(x)                 if X is discrete
E[g(X)] = μ_{g(X)} = ∫_{−∞}^{∞} g(x)·f(x) dx       if X is continuous
Example:
Let X be a discrete random variable with the
following probability distribution

Find E[g(X)], where g(X) = (X − 1)².

Solution:

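The distribution table for this example did not survive extraction, so the following sketch uses an assumed, purely illustrative pmf to show how E[g(X)] = Σ g(x)·f(x) is evaluated:

```python
from fractions import Fraction

# Hypothetical pmf (illustrative values; the original table was lost):
pmf = {0: Fraction(1, 3), 1: Fraction(1, 2), 2: Fraction(0), 3: Fraction(1, 6)}

def g(x):
    return (x - 1) ** 2

# E[g(X)] = sum over x of g(x) * f(x)
e_g = sum(g(x) * p for x, p in pmf.items())
print(e_g)  # 1
```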
Example
Let X be a continuous random variable that
represents the life (in hours) of a certain electronic
device. The pdf of X is given by:

Find E[1/X].
Solution:

Variance (of a Random Variable)
The most important measure of variability of a random variable X is called the variance of X and is denoted by Var(X) or σ²_X.
Definition
Let X be a random variable with a probability distribution f(x) and mean μ. The variance of X is defined by:

Var(X) = σ² = E[(X − μ)²] = Σ_x (x − μ)²·f(x)                 if X is discrete
Var(X) = σ² = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)²·f(x) dx       if X is continuous
Definition: (standard deviation)
The positive square root of the variance, σ = √Var(X), is called the standard deviation of X.
Theorem
The variance of a random variable X can be computed as:

Var(X) = σ² = E(X²) − μ²
Example
Let X be a discrete random variable with the
following probability distribution

Solution:

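The distribution table for the variance example above was lost in extraction. As a sketch with an assumed pmf (illustrative values, not from the slides), the variance can be computed both from the definition E[(X − μ)²] and from the shortcut E(X²) − μ², and the two must agree:

```python
from fractions import Fraction

# Hypothetical pmf (illustrative values; the original table was lost):
pmf = {1: Fraction(1, 4), 2: Fraction(1, 2), 3: Fraction(1, 4)}

mu = sum(x * p for x, p in pmf.items())                        # E(X)
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())       # E[(X - mu)^2]
var_short = sum(x * x * p for x, p in pmf.items()) - mu ** 2   # E(X^2) - mu^2
print(mu, var_def, var_short)  # 2 1/2 1/2
```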
Example

Solution:

Means and Variances of Linear Combinations of
Random Variables

Theorem
If X is a random variable with mean 𝜇 = 𝐸(𝑋), and
if a and b are constants, then:
E(aX ± b) = a·E(X) ± b  ⇔  μ_{aX±b} = a·μ_X ± b

Corollary 1: E(b) = b (a=0 in Theorem)

Corollary 2: E(aX) = a E(X) (b=0 in Theorem)

Example
Let X be a random variable with the following
probability density function:

Find E(4X+3).

Solution:

Another solution:

Theorem:

Corollary:

If X and Y are random variables, then:

𝐸(𝑋 ± 𝑌) = 𝐸(𝑋) ± 𝐸(𝑌)

Theorem
If X is a random variable with variance Var(X) = σ²_X, and if a and b are constants, then:

Var(aX ± b) = a² Var(X)  ⇔  σ²_{aX±b} = a² σ²_X
Theorem:

Corollary:
If X and Y are independent random variables, then:

• Var(aX + bY) = a² Var(X) + b² Var(Y)

• Var(aX − bY) = a² Var(X) + b² Var(Y)

• Var(X ± Y) = Var(X) + Var(Y)

Example:
Let X and Y be two independent random variables such that E(X) = 2, Var(X) = 4, E(Y) = 7, and Var(Y) = 1. Find:
1. E(3X+7) and Var(3X+7)
2. E(5X+2Y−2) and Var(5X+2Y−2).

Solution:
1. E(3X + 7) = 3E(X) + 7 = 3(2) + 7 = 13
   Var(3X + 7) = 3² Var(X) = (9)(4) = 36
2. E(5X + 2Y − 2) = 5E(X) + 2E(Y) − 2 = (5)(2) + (2)(7) − 2 = 22
   Var(5X + 2Y − 2) = Var(5X + 2Y) = 5² Var(X) + 2² Var(Y) = (25)(4) + (4)(1) = 104
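The linearity rules can be sanity-checked by simulation. The example only fixes the means and variances, so the sketch below assumes (for illustration) X ~ Normal(2, sd=2) and Y ~ Normal(7, sd=1), independent:

```python
import random

# Simulate 5X + 2Y - 2 for independent X, Y with E(X)=2, Var(X)=4,
# E(Y)=7, Var(Y)=1 (normal distributions assumed for illustration).
random.seed(0)
n = 200_000
samples = [5 * random.gauss(2, 2) + 2 * random.gauss(7, 1) - 2 for _ in range(n)]

mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / (n - 1)
print(round(mean, 1), round(var, 1))  # close to 22 and 104
```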

Chebyshev's Theorem
Suppose that X is any random variable with mean E(X) = μ, variance Var(X) = σ², and standard deviation σ.
Chebyshev's Theorem gives a conservative estimate of the probability that the random variable X assumes a value within k standard deviations (kσ) of its mean μ, namely P(μ − kσ < X < μ + kσ):

P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k²
Theorem

Let X be a random variable with mean E(X) = μ and variance Var(X) = σ². Then for k > 1, we have:

P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k²,   or equivalently   P(|X − μ| ≥ kσ) ≤ 1/k²
Example
Let X be a random variable having an unknown distribution with mean μ = 8 and variance σ² = 9 (standard deviation σ = 3). Find the following probabilities:
(a) P(−4 < X < 20)
(b) P(|X − 8| ≥ 6)
Solution:
(a) P(−4 < X < 20) = ?

(−4 < X < 20) = (μ − kσ < X < μ + kσ)
−4 = μ − kσ ⇔ −4 = 8 − 3k ⇔ 3k = 12 ⇔ k = 4

By Chebyshev's Theorem:
P(−4 < X < 20) = P(μ − 4σ < X < μ + 4σ) ≥ 1 − 1/4² = 15/16

(b) P(|X − 8| ≥ 6) = P(|X − μ| ≥ kσ), where kσ = 6, so k = 6/3 = 2.
P(|X − 8| ≥ 6) ≤ 1/k² = 1/4
Another solution for part (b):

P(|X − 8| < 6) = P(−6 < X − 8 < 6)
             = P(−6 + 8 < X < 6 + 8)
             = P(2 < X < 14)
(2 < X < 14) = (μ − kσ < X < μ + kσ)
2 = μ − kσ ⇔ 2 = 8 − 3k ⇔ 3k = 6 ⇔ k = 2
By Chebyshev's Theorem, P(2 < X < 14) ≥ 1 − 1/2² = 3/4, so
P(|X − 8| ≥ 6) = 1 − P(|X − 8| < 6) ≤ 1 − 3/4 = 1/4
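Chebyshev's bound holds for any distribution with finite variance, and is usually far from tight. A small empirical check (a sketch; the exponential distribution is assumed here purely for illustration, with μ = σ = 1):

```python
import random

# Empirical check of P(|X - mu| >= k*sigma) <= 1/k^2
# for an exponential distribution with mu = sigma = 1.
random.seed(1)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]
mu, sigma = 1.0, 1.0

for k in (2, 3, 4):
    tail = sum(abs(x - mu) >= k * sigma for x in samples) / n
    assert tail <= 1 / k**2  # the Chebyshev bound holds
    print(k, tail, 1 / k**2)
```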
Moments of Random Variables

The kth moment of X:

μ_k = E[X^k] = Σ_x x^k · p(x)                 if X is discrete
μ_k = E[X^k] = ∫_{−∞}^{∞} x^k · f(x) dx       if X is continuous

The kth central moment of X:

μ⁰_k = E[(X − μ)^k] = Σ_x (x − μ)^k · p(x)                 if X is discrete
μ⁰_k = E[(X − μ)^k] = ∫_{−∞}^{∞} (x − μ)^k · f(x) dx       if X is continuous

where μ = μ_1 = E(X) is the first moment of X.
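As a concrete illustration (a fair six-sided die, assumed here as an example rather than taken from the slides), the first moments and central moments can be computed directly from the definitions:

```python
from fractions import Fraction

# pmf of a fair six-sided die (illustrative example).
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def moment(k):
    return sum(x**k * p for x, p in pmf.items())

mu = moment(1)  # first moment = E(X) = 7/2

def central_moment(k):
    return sum((x - mu)**k * p for x, p in pmf.items())

print(mu, moment(2), central_moment(2))  # 7/2 91/6 35/12
```

Note that the second central moment equals the variance, and indeed 91/6 − (7/2)² = 35/12.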


Moment Generating Function of a R.V. X

m_X(t) = E[e^{tX}] = Σ_x e^{tx} · p(x)                 if X is discrete
m_X(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx} · f(x) dx       if X is continuous

Recall the series expansion of the exponential function:

e^u = Σ_{k=0}^{∞} u^k / k! = 1 + u + u²/2! + u³/3! + ⋯ + u^k/k! + ⋯
Properties of Moment Generating Functions

1. m_X(0) = 1

   Since m_X(t) = E[e^{tX}], we have m_X(0) = E[e^{0·X}] = E[1] = 1.

2. m_X(t) = 1 + μ_1 t + μ_2 t²/2! + μ_3 t³/3! + ⋯ + μ_k t^k/k! + ⋯

   We use the expansion of the exponential function:
   e^u = 1 + u + u²/2! + u³/3! + ⋯ + u^k/k! + ⋯
   Hence:
   m_X(t) = E[e^{tX}]
          = E[1 + tX + (t²/2!)X² + (t³/3!)X³ + ⋯ + (t^k/k!)X^k + ⋯]
          = 1 + t·E(X) + (t²/2!)·E(X²) + (t³/3!)·E(X³) + ⋯ + (t^k/k!)·E(X^k) + ⋯
          = 1 + μ_1 t + μ_2 t²/2! + μ_3 t³/3! + ⋯ + μ_k t^k/k! + ⋯
3. m_X^{(k)}(0) = (dᵏ/dtᵏ) m_X(t) |_{t=0} = μ_k

   Now m_X(t) = 1 + μ_1 t + μ_2 t²/2! + μ_3 t³/3! + ⋯ + μ_k t^k/k! + ⋯
   so
   m_X′(t) = μ_1 + μ_2 t + μ_3 t²/2! + ⋯ + μ_k t^{k−1}/(k−1)! + ⋯   and   m_X′(0) = μ_1
   m_X″(t) = μ_2 + μ_3 t + μ_4 t²/2! + ⋯ + μ_k t^{k−2}/(k−2)! + ⋯   and   m_X″(0) = μ_2
   Continuing in this way, we find m_X^{(k)}(0) = μ_k.
If k is odd: μ_k = 0.

For even moments:

μ_{2k} = (2k)! / (2^k · k!)

Thus μ_1 = 0, μ_2 = 2!/2 = 1, μ_3 = 0, μ_4 = 4!/(2² · 2!) = 3.
(These are the moments of the standard normal distribution.)
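Property 3 can be checked numerically: differentiate an MGF at t = 0 and compare with the moments computed directly. The sketch below (an assumed example, not from the slides) uses a fair six-sided die, whose MGF is m_X(t) = (1/6) Σ_{x=1}^{6} e^{tx}, and finite differences to approximate the first two derivatives:

```python
import math

# MGF of a fair six-sided die.
def mgf(t):
    return sum(math.exp(t * x) for x in range(1, 7)) / 6

h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)            # ~ m'(0)  = E(X)   = 3.5
m2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2  # ~ m''(0) = E(X^2) = 91/6
print(round(m1, 4), round(m2, 4))
```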
