Math 323 Lectures #15

Mathematical Expectation and Variance
Mathematical Expectation

The expected value of a random variable X is a measure of the centrality of X.

It is the weighted mean, or average, of all possible values of the random variable X.
Definition
Let X be a discrete random variable with possible values {x1, x2, x3, ...} (finite or countably infinite). The expected value of X, denoted by E(X), is defined as

E(X) = ∑ xk · P(X = xk) = ∑ xk pX(xk)
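As a sketch of this definition (my own illustration, not from the slides), the weighted average can be computed directly for a fair six-sided die:

```python
# E(X) = sum over k of xk * P(X = xk), here for a fair six-sided die.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

expected = sum(x * p for x, p in zip(values, probs))
print(expected)  # ≈ 3.5
```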

Mathematical Expectation
Mathematical Expectation and Variance

1. The expected value of X is known as the population mean of X.
2. The expected value is denoted by µX.
3. The expected value is the weighted average of X, where the weights are the probabilities at the possible values of X.
Mathematical Expectation
Mathematical Expectation and Variance

4. The expected value of X can be thought of as a long-run average of X.
5. The expected value of X is a theoretical average. It is rarely realized exactly in practice.
6. The expected value of X exists if E|X| < ∞, i.e., if ∑ |xk| pX(xk) converges.
Mathematical Expectation
Properties

When it exists, the mathematical expectation satisfies the following properties. If a is a constant, then
1. E(a) = a:

E(a) = ∑ a pX(x) = a ∑ pX(x) = a · 1 = a
Mathematical Expectation
Properties

2. If a is a constant, then E(aX) = aE(X).
3. The mathematical expectation is linear. That is,

E(X1 + X2 + ... + Xn) = E(X1) + E(X2) + ... + E(Xn),

for any set of random variables X1, X2, ..., Xn.

Result: Let a1 and a2 be two constants. Then, when the mathematical expectation exists, it satisfies the following property:

E(a1 X1 + a2 X2) = a1 E(X1) + a2 E(X2)
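As an illustrative check (not from the slides), this result can be verified on a small joint distribution; the two independent fair coins below are an arbitrary choice:

```python
import itertools

# Joint pmf of (X1, X2): two independent fair 0/1 coins, each pair has probability 1/4.
outcomes = [(x1, x2, 0.25) for x1, x2 in itertools.product([0, 1], repeat=2)]

a1, a2 = 3.0, -2.0
lhs = sum((a1 * x1 + a2 * x2) * p for x1, x2, p in outcomes)   # E(a1 X1 + a2 X2)
e_x1 = sum(x1 * p for x1, x2, p in outcomes)                    # E(X1)
e_x2 = sum(x2 * p for x1, x2, p in outcomes)                    # E(X2)
print(lhs, a1 * e_x1 + a2 * e_x2)  # equal up to rounding
```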
Examples

Roll a fair die:

Toss a fair coin:

Toss a biased coin:
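These three examples are left for in-class work; as a hedged sketch, each expectation follows directly from the definition (the 0.7 head probability for the biased coin is an arbitrary illustrative choice):

```python
def expectation(pmf):
    """E(X) = sum of x * P(X = x) over the support of a discrete pmf."""
    return sum(x * p for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}         # fair six-sided die
fair_coin = {0: 0.5, 1: 0.5}                  # X = number of heads in one toss
biased_coin = {0: 0.3, 1: 0.7}                # heads with probability 0.7 (illustrative)

print(expectation(die))          # ≈ 3.5
print(expectation(fair_coin))    # ≈ 0.5
print(expectation(biased_coin))  # ≈ 0.7
```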

Expectation
Result

Consider the probability mass function f(x) = c/x² of the random variable X, where x = 1, 2, .... Find the expected value of X.

E(X) = ∑_{x=1}^{∞} x f(x)
     = ∑_{x=1}^{∞} x · (c/x²)
     = ∑_{x=1}^{∞} c/x
     = c ∑_{x=1}^{∞} 1/x
Expectation
Result

The series

∑_{x=1}^{∞} 1/x

is a divergent series (the harmonic series). Therefore, the expected value of X does not exist.
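A small numerical illustration (my own, not from the slides): the partial sums 1 + 1/2 + ... + 1/n grow without bound, roughly like ln(n), which is why c ∑ 1/x cannot converge:

```python
import math

# Partial sums of the harmonic series: H(n) = 1 + 1/2 + ... + 1/n ~ ln(n) + 0.577...
def harmonic(n):
    return sum(1 / k for k in range(1, n + 1))

for n in (10, 1_000, 100_000):
    print(n, harmonic(n), math.log(n))  # the partial sums keep climbing with ln(n)
```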
Expectation
Example, textbook p. 97

An insurance company issues a one-year $1000 policy insuring against


an occurrence A that historically happens to 2 out of every 100 owners
of the policy. Administrative fees are $15 per policy and are not part of
the company’s “profit.” How much should the company charge for the
policy if it requires that the expected profit per policy be $50?

Expectation
Example

Let C be the premium for the policy. The company's "profit" is C − 15 if the event A does not occur, and C − 15 − 1000 if A does occur. Let X be a random variable that represents the company's profit. Then X = C − 15 with probability 0.98, and X = C − 15 − 1000 with probability 0.02.
Expectation
Example

Then,

E(X) = (C − 15)(0.98) + (C − 15 − 1000)(0.02) = 50
⟹ C = $85.
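A quick numerical check of this answer (illustrative):

```python
# Expected profit for premium C: E(X) = (C - 15)(0.98) + (C - 15 - 1000)(0.02).
def expected_profit(c):
    return (c - 15) * 0.98 + (c - 15 - 1000) * 0.02

# Setting E(X) = 50 gives C = 50 + 15 + 1000 * 0.02 = 85.
print(expected_profit(85))  # ≈ 50.0
```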
Mathematical Expectation and Variance
Variance

The mean of a random variable is not sufficient to characterise its distribution. We need a measure of spread as well. Hence, we define the following.
Definition
The variance of a random variable X, with mean E (X ) = µX , is defined
as
Var (X ) = E [(X − µX )2 ] = E [(X − E (X ))2 ]

Variance

1. The variance of the random variable X is denoted by Var(X) = σX².
2. The variance of a random variable X is the expected value of (X − µX)².
3. Therefore, a large value of σX² means that the values of X are spread far from the mean.
4. A probability distribution is often summarised by µX and σX², parameters of the distribution that are constant for a given population.
5. As (X − µX)² ≥ 0, Var(X) ≥ 0.
6. The units of Var(X) are the same as those of X². Therefore, we prefer to use the positive square root of Var(X): SD(X) = σX = √(σX²). σX has the same units as X.
7. Var(cX) = c² Var(X)
Variance
Result

Theorem
Var(X) = E[X²] − [E(X)]²

Proof:

Var(X) = E[(X − µX)²]
       = E[X² − 2µX X + µX²]
       = E[X²] − 2µX E[X] + µX²      (by linearity of expectation; µX is a constant)
       = E[X²] − 2[E(X)]² + [E(X)]²
       = E[X²] − [E(X)]²
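The computing formula can be sanity-checked on any small pmf; the distribution below is an arbitrary illustrative choice:

```python
def e(pmf, f=lambda x: x):
    """E[f(X)] for a discrete pmf given as {value: probability}."""
    return sum(f(x) * p for x, p in pmf.items())

pmf = {0: 0.2, 1: 0.5, 2: 0.3}  # arbitrary small distribution for the check

mu = e(pmf)
var_def = e(pmf, lambda x: (x - mu) ** 2)        # definition: E[(X - mu)^2]
var_short = e(pmf, lambda x: x ** 2) - mu ** 2   # shortcut: E[X^2] - [E(X)]^2
print(var_def, var_short)  # equal up to rounding
```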
Variance
Fair die roll

We have seen that E(X) = 3.5.

E(X²) = 1² × 1/6 + 2² × 1/6 + 3² × 1/6 + 4² × 1/6 + 5² × 1/6 + 6² × 1/6 = 91/6

σ² = Var(X) = E[X²] − [E(X)]²
            = 91/6 − (7/2)²
            = 35/12 ≈ 2.92

σX = √(σX²) = √(35/12) ≈ 1.71
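The die arithmetic above can be reproduced exactly with rational arithmetic (an illustrative sketch):

```python
from fractions import Fraction

# Fair six-sided die: exact E(X), E(X^2), and Var(X) = E(X^2) - E(X)^2.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
mean = sum(x * p for x, p in pmf.items())
ex2 = sum(x * x * p for x, p in pmf.items())
var = ex2 - mean ** 2
print(mean, ex2, var)  # 7/2 91/6 35/12
```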
Mathematical Expectation and Variance
More Examples

Let X be a random variable such that P(X = x) = 1/n for x = 1, 2, ..., n. That is, X has the uniform distribution on the first n positive integers.

E(X) = (1/n)(1 + 2 + ... + n) = (1/n) · n(n + 1)/2 = (n + 1)/2
Mathematical Expectation and Variance
More Examples

E[X²] = (1/n)(1² + 2² + ... + n²)
      = (1/n) · n(n + 1)(2n + 1)/6
      = (n + 1)(2n + 1)/6

Var(X) = E[X²] − [E(X)]²
       = (n + 1)(2n + 1)/6 − ((n + 1)/2)²
       = (n² − 1)/12
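The closed form (n² − 1)/12 can be checked exactly for several n (an illustrative sketch):

```python
from fractions import Fraction

# Uniform distribution on {1, ..., n}: verify Var(X) = (n^2 - 1) / 12 exactly.
def uniform_variance(n):
    p = Fraction(1, n)
    mean = sum(x * p for x in range(1, n + 1))
    return sum(x * x * p for x in range(1, n + 1)) - mean ** 2

for n in (2, 6, 10):
    print(n, uniform_variance(n), Fraction(n * n - 1, 12))  # the two columns match
```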
Moments of a distribution

Let X be a discrete random variable. Then the moments of X about the origin are

E(X): first moment of X about the origin
E(X²): second moment of X about the origin
⋮
E(Xᵏ): kth moment of X about the origin,

where E(Xʳ) < ∞ for r = 1, 2, ..., k.
Moments of a distribution

1. Moments can be very useful in finding the mean and the variance of a random variable X.
2. The mean of the random variable X, E(X), is the first moment of X about the origin.
3. The variance of the random variable X, E[(X − µ)²], is the second moment of X about the mean.
Moments of a distribution
Factorial Moments

Let X be a random variable. Then the rth factorial moment of X is

E(X₍ᵣ₎) = E[X(X − 1) ⋯ (X − (r − 1))]

E(X): first factorial moment of X
E[X(X − 1)]: second factorial moment of X
Moments of a distribution
Factorial Moments

The ordinary (classical) moments can be obtained from the factorial moments by solving a system of linear equations. It is very simple for the first few:

E[X] = E[X],
E[X²] = E[X(X − 1)] + E[X].

We will use the classical moments and factorial moments to find the mean and variance of some of the discrete distributions we discussed in the previous few lectures.
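As an illustrative check of E[X²] = E[X(X − 1)] + E[X], on an arbitrary pmf (a Binomial(5, 0.3) chosen just for the demonstration):

```python
from math import comb

# Binomial(n=5, p=0.3) pmf, used to check E[X^2] = E[X(X-1)] + E[X].
n, p = 5, 0.3
pmf = {x: comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)}

ex = sum(x * q for x, q in pmf.items())               # E[X]
fact2 = sum(x * (x - 1) * q for x, q in pmf.items())  # E[X(X-1)]
ex2 = sum(x * x * q for x, q in pmf.items())          # E[X^2]
print(ex2, fact2 + ex)  # equal up to rounding
```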
Mean and Variance of the uniform Distribution

A random variable X is said to have a discrete uniform distribution if pX(x) = P(X = x) = 1/N for x = x1, x2, ..., xN. Then

E(X) = ∑_{k=1}^{N} xk · P(X = xk)
     = ∑_{k=1}^{N} xk · pX(xk)
     = (1/N) ∑_{k=1}^{N} xk = x̄
Mean and Variance of the uniform Distribution

To find the variance we will use the computing formula Var(X) = E[X²] − [E(X)]². Then

E(X²) = ∑_{k=1}^{N} xk² · P(X = xk)
      = ∑_{k=1}^{N} xk² · pX(xk)
      = (1/N) ∑_{k=1}^{N} xk²

Var(X) = E[X²] − [E(X)]²
       = (1/N) ∑_{k=1}^{N} xk² − x̄²
Mean and Variance of the Bernoulli Distribution

The probability distribution of the Bernoulli random variable X is

x P(X = x )
0 1−p
1 p

Mean and Variance of the Bernoulli Distribution

E(X) = 0 · (1 − p) + 1 · p = p

E(X²) = 0² · (1 − p) + 1² · p = p

Var(X) = E[X²] − [E(X)]²
       = p − p² = p(1 − p)
Mean and Variance of the Binomial Distribution

For a binomial random variable X,

P(X = x) = (n choose x) p^x (1 − p)^(n−x)

The mean of X is

E(X) = ∑_{x=0}^{n} x · P(X = x) = ∑_{x=0}^{n} x · (n choose x) p^x (1 − p)^(n−x)
Mean and Variance of the Binomial Distribution

E(X) = ∑_{x=0}^{n} x · (n choose x) p^x (1 − p)^(n−x)
     = ∑_{x=0}^{n} x · [n! / (x!(n − x)!)] p^x (1 − p)^(n−x)
     = ∑_{x=1}^{n} [n(n − 1)! / ((x − 1)!(n − x)!)] p^x (1 − p)^(n−x)
Mean and Variance of the Binomial Distribution

     = ∑_{x=1}^{n} [n(n − 1)! / ((x − 1)!((n − 1) − (x − 1))!)] p · p^(x−1) (1 − p)^((n−1)−(x−1))
     = np ∑_{x−1=0}^{n−1} (n−1 choose x−1) p^(x−1) (1 − p)^((n−1)−(x−1))
     = np
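The result E(X) = np can be confirmed by brute-force summation over the pmf (an illustrative sketch):

```python
from math import comb

# E(X) for Binomial(n, p) by direct summation over the pmf.
def binomial_mean(n, p):
    return sum(x * comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1))

for n, p in [(10, 0.5), (7, 0.2)]:
    print(n, p, binomial_mean(n, p))  # ≈ n * p in each case
```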
Mean and Variance of the Binomial Distribution

Var(X) = E(X²) − [E(X)]²
The expected value of X² is E(X²) = ∑_{x=0}^{n} x² · P(X = x).
Instead of using the previous method of simplification, we will use E(X(X − 1)) to find E(X²).

E(X(X − 1)) = ∑_{x=0}^{n} x(x − 1) · P(X = x)
            = ∑_{x=2}^{n} x(x − 1) · P(X = x)
Mean and Variance of the Binomial Distribution

E(X(X − 1)) = ∑_{x=2}^{n} x(x − 1) · (n choose x) p^x (1 − p)^(n−x)
            = ∑_{x=2}^{n} x(x − 1) · [n! / (x!(n − x)!)] p^x (1 − p)^(n−x)
            = ∑_{x=2}^{n} x(x − 1) · [n(n − 1)(n − 2)! / (x!(n − x)!)] p^x (1 − p)^(n−x)
Mean and Variance of the Binomial Distribution

            = ∑_{x=2}^{n} [n(n − 1)(n − 2)! / ((x − 2)!(n − x)!)] p² p^(x−2) (1 − p)^(n−x)
            = ∑_{x=2}^{n} [n(n − 1)(n − 2)! / ((x − 2)!((n − 2) − (x − 2))!)] p² p^(x−2) (1 − p)^((n−2)−(x−2))
            = n(n − 1)p² ∑_{x−2=0}^{n−2} (n−2 choose x−2) p^(x−2) (1 − p)^((n−2)−(x−2))
            = n(n − 1)p²
Mean and Variance of the Binomial Distribution

Var(X) = E(X²) − [E(X)]²
E(X(X − 1)) = E(X² − X) = E(X²) − E(X)
E(X²) = E(X(X − 1)) + E(X)
      = n(n − 1)p² + np = n²p² − np² + np
Var(X) = n²p² − np² + np − n²p²
       = np − np² = np(1 − p)
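Likewise, Var(X) = np(1 − p) can be confirmed numerically (illustrative; n = 12, p = 0.35 are arbitrary):

```python
from math import comb

# Var(X) for Binomial(n, p): compare direct summation with np(1 - p).
def binomial_var(n, p):
    pmf = [comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)]
    mean = sum(x * q for x, q in enumerate(pmf))
    return sum((x - mean) ** 2 * q for x, q in enumerate(pmf))

n, p = 12, 0.35
print(binomial_var(n, p), n * p * (1 - p))  # both ≈ 2.73
```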
Mean and Variance of the Poisson Distribution

Let X ∼ Poisson(λ). Then

P(X = x) = λ^x e^(−λ) / x!,   x = 0, 1, 2, ...

E(X) = ∑_{x=0}^{∞} x · λ^x e^(−λ) / x!
     = ∑_{x=1}^{∞} x · λ^x e^(−λ) / x!
     = ∑_{x=1}^{∞} λ^x e^(−λ) / (x − 1)!
Mean and Variance of the Poisson Distribution


E(X) = λ ∑_{x−1=0}^{∞} λ^(x−1) e^(−λ) / (x − 1)!
     = λ

Var(X) = E(X²) − [E(X)]²

E(X(X − 1)) = ∑_{x=0}^{∞} x(x − 1) · λ^x e^(−λ) / x!
            = ∑_{x=2}^{∞} x(x − 1) · λ^x e^(−λ) / x!
Mean and Variance of the Poisson Distribution


E(X(X − 1)) = ∑_{x=2}^{∞} λ^x e^(−λ) / (x − 2)!
            = λ² ∑_{x=2}^{∞} λ^(x−2) e^(−λ) / (x − 2)!
            = λ²

E(X(X − 1)) = E(X²) − E(X) = λ²
⟹ E(X²) = λ² + λ
Mean and Variance of the Poisson Distribution

Therefore,

Var(X) = E(X²) − [E(X)]²
       = λ² + λ − λ²
       = λ
