
STAT 410

Fall 2016

Review
Random variables: discrete and continuous.

Discrete: probability mass function (p.m.f.)
    p(x) = P(X = x)
    0 ≤ p(x) ≤ 1
    ∑_{all x} p(x) = 1

Continuous: probability density function (p.d.f.)
    f(x)
    f(x) ≥ 0
    ∫_{-∞}^{∞} f(x) dx = 1

Cumulative distribution function (c.d.f.):
    F(x) = P(X ≤ x)

Discrete:    F(x) = ∑_{y ≤ x} p(y)
Continuous:  F(x) = ∫_{-∞}^{x} f(y) dy

Expected value:
    E(X) = μ_X

Discrete:    If ∑_{all x} |x| p(x) < ∞,  then  E(X) = ∑_{all x} x p(x).
Continuous:  If ∫_{-∞}^{∞} |x| f(x) dx < ∞,  then  E(X) = ∫_{-∞}^{∞} x f(x) dx.

Discrete:    If ∑_{all x} |g(x)| p(x) < ∞,  then  E(g(X)) = ∑_{all x} g(x) p(x).
Continuous:  If ∫_{-∞}^{∞} |g(x)| f(x) dx < ∞,  then  E(g(X)) = ∫_{-∞}^{∞} g(x) f(x) dx.

Variance:
    Var(X) = σ_X² = E( [ X - μ_X ]² ) = E(X²) - [ E(X) ]²

Discrete:    Var(X) = ∑_{all x} (x - μ_X)² p(x) = ∑_{all x} x² p(x) - [ E(X) ]²
Continuous:  Var(X) = ∫_{-∞}^{∞} (x - μ_X)² f(x) dx = ∫_{-∞}^{∞} x² f(x) dx - [ E(X) ]²

Moment-generating function (m.g.f.):
    M_X(t) = E( e^{tX} )

Discrete:    M_X(t) = ∑_{all x} e^{tx} p(x)
Continuous:  M_X(t) = ∫_{-∞}^{∞} e^{tx} f(x) dx
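The definitions above translate directly into a few lines of code. Below is a minimal numerical sketch (not part of the original handout) for computing the mean, variance, and an m.g.f. value of a discrete p.m.f. stored as a dict, or of a continuous p.d.f. given as a function; it assumes SciPy is available, and the function names are illustrative.

```python
import math
from scipy.integrate import quad  # assumes SciPy is installed

def discrete_summaries(pmf, t=1.0):
    """Mean, variance, and M_X(t) for a p.m.f. given as {value: probability}."""
    mean = sum(x * p for x, p in pmf.items())
    var = sum(x**2 * p for x, p in pmf.items()) - mean**2
    mgf_t = sum(math.exp(t * x) * p for x, p in pmf.items())
    return mean, var, mgf_t

def continuous_summaries(pdf, a, b, t=1.0):
    """Mean, variance, and M_X(t) for a p.d.f. supported on (a, b)."""
    mean = quad(lambda x: x * pdf(x), a, b)[0]
    var = quad(lambda x: x**2 * pdf(x), a, b)[0] - mean**2
    mgf_t = quad(lambda x: math.exp(t * x) * pdf(x), a, b)[0]
    return mean, var, mgf_t

# e.g. the p.m.f. of Example 1 and the p.d.f. f(x) = 3x^2 of Example 2 below:
print(discrete_summaries({1: 0.2, 2: 0.4, 3: 0.3, 4: 0.1}))
print(continuous_summaries(lambda x: 3 * x**2, 0, 1))
```

The worked examples that follow can all be spot-checked in this way.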

Example 1:

    x      p(x)     F(x)
    1      0.2      0.2
    2      0.4      0.6
    3      0.3      0.9
    4      0.1      1.0

             0       x < 1
             0.2     1 ≤ x < 2
    F(x) =   0.6     2 ≤ x < 3
             0.9     3 ≤ x < 4
             1       x ≥ 4

    x      p(x)     x p(x)
    1      0.2      0.2
    2      0.4      0.8
    3      0.3      0.9
    4      0.1      0.4

E(X) = μ_X = 2.3.

    x      p(x)     x² p(x)
    1      0.2      0.2
    2      0.4      1.6
    3      0.3      2.7
    4      0.1      1.6

E(X²) = 6.1.        Var(X) = 6.1 - 2.3² = 0.81.

M_X(t) = E( e^{tX} ) = 0.2 e^{t} + 0.4 e^{2t} + 0.3 e^{3t} + 0.1 e^{4t}.
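As a quick sanity check (illustrative, not part of the handout), the numbers in Example 1 can be reproduced with a few lines of Python:

```python
import math

pmf = {1: 0.2, 2: 0.4, 3: 0.3, 4: 0.1}

def F(x):                                      # c.d.f.: sum p(y) over y <= x
    return sum(p for y, p in pmf.items() if y <= x)

print([F(x) for x in (0.5, 1, 2.7, 3, 5)])     # [0, 0.2, 0.6, 0.9, 1.0]

EX  = sum(x * p for x, p in pmf.items())       # 2.3
EX2 = sum(x * x * p for x, p in pmf.items())   # 6.1
print(EX, EX2 - EX**2)                         # 2.3, ~0.81

M = lambda t: sum(p * math.exp(t * x) for x, p in pmf.items())
h = 1e-6
print((M(h) - M(-h)) / (2 * h))                # ~2.3 = E(X), cf. Theorem 2 below
```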

Example 2:
Let X be a continuous random variable with the probability density function

    f(x) = k x²,   0 < x < 1;        f(x) = 0  otherwise.

a)  What must the value of k be so that f(x) is a probability density function?

    Need  1)  f(x) ≥ 0,   and   2)  ∫_{-∞}^{∞} f(x) dx = 1.

    ∫_{-∞}^{∞} f(x) dx = ∫_0^1 k x² dx = [ k x³/3 ]_0^1 = k/3 = 1.        ⇒  k = 3.

b)  Find the cumulative distribution function F(x) = P(X ≤ x).

    f_X(x) = 3x²,  0 < x < 1;   f_X(x) = 0  otherwise.

    For x < 0:       F_X(x) = 0.
    For 0 ≤ x < 1:   F_X(x) = ∫_0^x 3y² dy = x³.
    For x ≥ 1:       F_X(x) = 1.

c)  Find the probability P(0.4 ≤ X ≤ 0.8).

    P(0.4 ≤ X ≤ 0.8) = ∫_{0.4}^{0.8} f(x) dx = ∫_{0.4}^{0.8} 3x² dx = [ x³ ]_{0.4}^{0.8} = 0.8³ - 0.4³ = 0.448.

    OR
    P(0.4 ≤ X ≤ 0.8) = F_X(0.8) - F_X(0.4⁻) = 0.8³ - 0.4³ = 0.448.

d)  Find the median of the distribution of X.

    Need m such that  ( area to the left of m ) = ∫_0^m f(x) dx = 1/2.

    ∫_0^m 3x² dx = [ x³ ]_0^m = m³ = 1/2.        ⇒  m = (1/2)^{1/3} ≈ 0.7937.

e)  Find μ_X = E(X).

    E(X) = μ_X = ∫ x f(x) dx = ∫_0^1 x · 3x² dx = ∫_0^1 3x³ dx = [ 3x⁴/4 ]_0^1 = 3/4 = 0.75.

f)  Find σ_X = SD(X).

    Var(X) = σ_X² = ∫ x² f(x) dx - (μ_X)² = ∫_0^1 3x⁴ dx - (3/4)²
           = [ 3x⁵/5 ]_0^1 - 9/16 = 3/5 - 9/16 = 3/80 = 0.0375.

    σ_X = SD(X) = √Var(X) = √0.0375 ≈ 0.19365.

g)  Find the moment-generating function of X, M_X(t).

    M_X(t) = E( e^{tX} ) = ∫ e^{tx} f(x) dx = ∫_0^1 e^{tx} · 3x² dx.

    Integration by parts with  u = 3x²,  dv = e^{tx} dx,  du = 6x dx,  v = (1/t) e^{tx}:

    M_X(t) = [ (3x²/t) e^{tx} ]_0^1 - (1/t) ∫_0^1 6x e^{tx} dx = (3/t) e^{t} - (1/t) ∫_0^1 6x e^{tx} dx.

    Integration by parts again with  u = 6x,  dv = e^{tx} dx,  du = 6 dx,  v = (1/t) e^{tx}:

    ∫_0^1 6x e^{tx} dx = [ (6x/t) e^{tx} ]_0^1 - (6/t) ∫_0^1 e^{tx} dx = (6/t) e^{t} - (6/t²)( e^{t} - 1 ).

    Therefore

    M_X(t) = (3/t) e^{t} - (6/t²) e^{t} + (6/t³) e^{t} - 6/t³,        t ≠ 0;        M_X(0) = 1.

h)  Find E(√X) and E(ln X).

    E(√X) = ∫_0^1 √x · 3x² dx = ∫_0^1 3 x^{5/2} dx = 6/7.

    E(ln X) = ∫_0^1 (ln x) · 3x² dx = [ x³ ln x - x³/3 ]_0^1 = -1/3.
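A short numerical check of Example 2 (a sketch, assuming SciPy is installed; the chosen values just mirror the parts above):

```python
import math
from scipy.integrate import quad

pdf = lambda x: 3 * x**2                                   # f(x) = 3x^2 on (0, 1)

total  = quad(pdf, 0, 1)[0]                                # 1.0
prob   = quad(pdf, 0.4, 0.8)[0]                            # 0.448
EX     = quad(lambda x: x * pdf(x), 0, 1)[0]               # 0.75
var    = quad(lambda x: x**2 * pdf(x), 0, 1)[0] - EX**2    # 0.0375
median = 0.5 ** (1 / 3)                                    # ~0.7937
E_sqrt = quad(lambda x: math.sqrt(x) * pdf(x), 0, 1)[0]    # 6/7
E_ln   = quad(lambda x: math.log(x) * pdf(x), 0, 1)[0]     # -1/3
print(total, prob, EX, var, median, E_sqrt, E_ln)
```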

Example 3:

    f_X(x) = 5 x^{-6},   x > 1;        f_X(x) = 0  otherwise.

c.d.f.:
    For x < 1:   F_X(x) = 0.
    For x ≥ 1:   F_X(x) = ∫_1^x 5 y^{-6} dy = [ -y^{-5} ]_1^x = 1 - x^{-5}.

E(X) = μ_X = ∫_1^∞ x · 5 x^{-6} dx = ∫_1^∞ 5 x^{-5} dx = 5/4 = 1.25.

E(X²) = ∫_1^∞ x² · 5 x^{-6} dx = ∫_1^∞ 5 x^{-4} dx = 5/3.

Var(X) = E(X²) - [ E(X) ]² = 5/3 - (5/4)² = 5/48.

E(X^{10}) does NOT exist since ∫_1^∞ x^{10} · 5 x^{-6} dx = ∫_1^∞ 5 x^{4} dx diverges.

Median:           F_X(m) = 1/2.        1 - m^{-5} = 1/2.        m = 2^{1/5} ≈ 1.1487.

30th percentile:  F_X(π_{0.30}) = 0.30.    1 - π_{0.30}^{-5} = 0.30.    π_{0.30} = (1/0.70)^{1/5} ≈ 1.07394.
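A numerical check of Example 3 (sketch; assumes SciPy):

```python
import math
from scipy.integrate import quad

pdf = lambda x: 5 * x**(-6)          # f(x) = 5 x^{-6} for x > 1

EX     = quad(lambda x: x * pdf(x), 1, math.inf)[0]      # 1.25
EX2    = quad(lambda x: x**2 * pdf(x), 1, math.inf)[0]   # 5/3
var    = EX2 - EX**2                                     # 5/48
median = 2 ** (1 / 5)                                    # ~1.1487
p30    = (1 / 0.7) ** (1 / 5)                            # ~1.07394
print(EX, EX2, var, median, p30)
```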

Example 4:
( Standard ) Cauchy distribution:

    f_X(x) = 1 / ( π (1 + x²) ),    -∞ < x < ∞.

Even though f_X(x) is symmetric about zero, E(X) is undefined since

    ∫_{-∞}^{∞} |x| / ( π (1 + x²) ) dx = ∞.

F_X(x) = ∫_{-∞}^{x} 1 / ( π (1 + y²) ) dy = (1/π) arctan(x) + 1/2,    -∞ < x < ∞.

P(X < -1) = P(-1 < X < 0) = P(0 < X < 1) = P(X > 1) = 0.25.

M_X(t) is undefined for all t ≠ 0.        M_X(0) = 1.
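An illustrative check of the Cauchy c.d.f. and quartiles above (sketch; assumes SciPy; the truncation points are arbitrary):

```python
import math
from scipy.integrate import quad

pdf = lambda x: 1 / (math.pi * (1 + x**2))
cdf = lambda x: math.atan(x) / math.pi + 0.5

print(cdf(-1), cdf(0), cdf(1))     # 0.25 0.5 0.75 -> the four equal quarters above

# E(X) is undefined: the truncated absolute first moment keeps growing with the cutoff b.
for b in (10, 100, 10000):
    print(b, 2 * quad(lambda x: x * pdf(x), 0, b)[0])    # grows like (1/pi) ln(1 + b^2)
```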

Theorem 1:   If M_{X_1}(t) = M_{X_2}(t) for all t in some interval containing 0,
             then f_{X_1}(x) = f_{X_2}(x)  ( X_1 and X_2 have the same distribution ).

Theorem 2:   M_X′(0) = E(X),    M_X″(0) = E(X²),    M_X^{(k)}(0) = E(X^k).

Theorem 3:   Let Y = a X + b.  Then  M_Y(t) = e^{bt} M_X(at).
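A symbolic illustration of Theorems 2 and 3, using the m.g.f. from Example 1 (a sketch; assumes SymPy is available):

```python
import sympy as sp

t, a, b = sp.symbols('t a b')
M = sp.Rational(2, 10) * sp.exp(t) + sp.Rational(4, 10) * sp.exp(2 * t) \
    + sp.Rational(3, 10) * sp.exp(3 * t) + sp.Rational(1, 10) * sp.exp(4 * t)

print(sp.diff(M, t).subs(t, 0))        # E(X)   = 23/10  (Theorem 2)
print(sp.diff(M, t, 2).subs(t, 0))     # E(X^2) = 61/10  (Theorem 2)

# Theorem 3: the m.g.f. of Y = aX + b is e^{bt} M_X(at); here we just form it and
# confirm it is still 1 at t = 0.
M_Y = sp.exp(b * t) * M.subs(t, a * t)
print(sp.simplify(M_Y.subs(t, 0)))     # 1
```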

Example 5:
Suppose a discrete random variable X has the following probability distribution:

    P(X = 0) = p,        P(X = k) = 1 / ( 2^k k! ),    k = 1, 2, 3, …

a)  Find the value of p that would make this a valid probability distribution.

    Must have   p + ∑_{k=1}^{∞} 1/( 2^k k! ) = 1.

    Since  ∑_{k=0}^{∞} a^k / k! = e^a,    ∑_{k=1}^{∞} 1/( 2^k k! ) = e^{1/2} - 1.

    Therefore  p + ( e^{1/2} - 1 ) = 1   and   p = 2 - e^{1/2}.

b)  Find E(X).

    E(X) = ∑_{all x} x p(x) = 0 · ( 2 - e^{1/2} ) + ∑_{k=1}^{∞} k / ( 2^k k! )
         = ∑_{k=1}^{∞} 1 / ( 2^k (k-1)! )
         = (1/2) ∑_{k=1}^{∞} 1 / ( 2^{k-1} (k-1)! )
         = (1/2) ∑_{n=0}^{∞} 1 / ( 2^n n! )
         = (1/2) e^{1/2}.

c)  Find the variance of X, Var(X).

    E( X(X-1) ) = ∑_{k=2}^{∞} k(k-1) / ( 2^k k! ) = ∑_{k=2}^{∞} 1 / ( 2^k (k-2)! )
                = (1/4) ∑_{k=2}^{∞} 1 / ( 2^{k-2} (k-2)! )
                = (1/4) ∑_{n=0}^{∞} 1 / ( 2^n n! )
                = (1/4) e^{1/2}.

    E(X²) = E( X(X-1) ) + E(X) = (3/4) e^{1/2}.

    Var(X) = E(X²) - [ E(X) ]² = (3/4) e^{1/2} - (1/4) e.

d)  Find the moment-generating function of X, M_X(t).

    M_X(t) = ∑_{all x} e^{tx} p(x) = 1 · ( 2 - e^{1/2} ) + ∑_{k=1}^{∞} e^{tk} / ( 2^k k! )
           = ( 2 - e^{1/2} ) + ∑_{k=1}^{∞} ( e^t / 2 )^k / k!
           = ( 2 - e^{1/2} ) + ( e^{e^t/2} - 1 )
           = 1 - e^{1/2} + e^{e^t/2}.

e)  Use the moment-generating function of X, M_X(t), to find E(X).

    M_X′(t) = e^{e^t/2} · ( e^t / 2 ),        E(X) = M_X′(0) = (1/2) e^{1/2}.

f)  Use the moment-generating function of X, M_X(t), to find the variance of X, Var(X).

    M_X″(t) = e^{e^t/2} · ( e^t / 2 )² + e^{e^t/2} · ( e^t / 2 ),

    E(X²) = M_X″(0) = (1/4) e^{1/2} + (1/2) e^{1/2} = (3/4) e^{1/2}.

    Var(X) = E(X²) - [ E(X) ]² = (3/4) e^{1/2} - (1/4) e.
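A numerical check of Example 5 (sketch; the infinite sums are truncated at K terms, which is plenty since the terms shrink factorially):

```python
import math

K = 30
p0 = 2 - math.exp(0.5)                                    # P(X = 0)
pk = {k: 1 / (2**k * math.factorial(k)) for k in range(1, K)}

total = p0 + sum(pk.values())                             # ~1
EX    = sum(k * q for k, q in pk.items())                 # (1/2) e^{1/2}
EX2   = sum(k**2 * q for k, q in pk.items())              # (3/4) e^{1/2}
var   = EX2 - EX**2                                       # (3/4) e^{1/2} - (1/4) e

M = lambda t: p0 + sum(math.exp(t * k) * q for k, q in pk.items())

print(total, EX, var, M(0.0))
print(0.5 * math.exp(0.5), 0.75 * math.exp(0.5) - math.exp(1) / 4)   # targets
```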

Example 6:
Let a > 2. Suppose a discrete random variable X has the following probability
distribution:

    p(0) = P(X = 0) = c,        p(k) = P(X = k) = 1 / a^k,    k = 1, 2, 3, …

a)  Find the value of c ( c will depend on a ) that makes this a valid probability
    distribution.

    Must have   ∑_{all x} p(x) = 1,   i.e.   c + ∑_{k=1}^{∞} 1/a^k = 1.

    Geometric series:   ∑_{k=0}^{∞} b^k = 1 / ( 1 - b ),   |b| < 1.

    ∑_{k=1}^{∞} 1/a^k = (1/a) ∑_{k=0}^{∞} (1/a)^k = (1/a) · 1 / ( 1 - 1/a ) = 1 / ( a - 1 ).

    c + 1/( a - 1 ) = 1.        c = 1 - 1/( a - 1 ) = ( a - 2 ) / ( a - 1 ).

b)  Find P( X is odd ).

    P( odd ) = p(1) + p(3) + p(5) + …  =  ( first term ) / ( 1 - base )
             = (1/a) / ( 1 - 1/a² ) = a / ( a² - 1 ).

c)  Find the moment-generating function of X, M_X(t). For which values of t does
    it exist?

    M_X(t) = E( e^{tX} ) = 1 · c + ∑_{k=1}^{∞} e^{tk} / a^k = c + ∑_{k=1}^{∞} ( e^t / a )^k

           = c + ( e^t / a ) / ( 1 - e^t / a ) = ( a - 2 )/( a - 1 ) + e^t / ( a - e^t ).

    Need  e^t / a < 1,   i.e.   t < ln a.

    M_X(t) = ( a - 2 )/( a - 1 ) + e^t / ( a - e^t ),        t < ln a.

d)  Find E(X).

    E(X) = M_X′(0).

    M_X′(t) = d/dt [ e^t / ( a - e^t ) ] = [ e^t ( a - e^t ) + e^t · e^t ] / ( a - e^t )²
            = a e^t / ( a - e^t )².

    ( OR:  since  e^t / ( a - e^t ) = a / ( a - e^t ) - 1,   M_X′(t) = d/dt [ a / ( a - e^t ) ] = a e^t / ( a - e^t )². )

    E(X) = M_X′(0) = a / ( a - 1 )².

    OR

    E(X) = ∑_{k=1}^{∞} k / a^k = [ 1/( a - 1 ) ] ∑_{k=1}^{∞} k · ( ( a - 1 )/a ) ( 1/a )^{k-1} = [ 1/( a - 1 ) ] E(Y),

    where Y is a Geometric random variable with probability of success ( a - 1 )/a, so that

    E(Y) = a / ( a - 1 ).        Therefore  E(X) = a / ( a - 1 )².

    OR

    E(X)        = 1/a + 2/a² + 3/a³ + 4/a⁴ + …
    (1/a) E(X)  =       1/a² + 2/a³ + 3/a⁴ + …

    Subtracting,   ( 1 - 1/a ) E(X) = 1/a + 1/a² + 1/a³ + … = 1 / ( a - 1 ).

    Therefore  E(X) = a / ( a - 1 )².

e)  Find the cumulative distribution function F(x) = P(X ≤ x).

    If x < 0,    F(x) = P(X ≤ x) = 0.

    If k = 0, 1, 2, 3, …,

        1 - F(k) = P(X > k) = ∑_{n=k+1}^{∞} 1/a^n = ( 1/a^{k+1} ) · 1 / ( 1 - 1/a ) = 1 / ( a^k ( a - 1 ) ),

        F(k) = 1 - 1 / ( a^k ( a - 1 ) ).

    Since X is a discrete integer-valued random variable, if k ≤ x < k + 1,

        F(x) = F(k) = 1 - 1 / ( a^k ( a - 1 ) ).

    Therefore,

        F(x) = 0                                 x < 0
        F(x) = 1 - 1 / ( a^k ( a - 1 ) )         k ≤ x < k + 1,   k = 0, 1, 2, 3, …
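A numerical check of Example 6 for one concrete value, a = 3 (sketch; the choice a = 3 and t = 0.5 are illustrative):

```python
import math

a = 3.0
c = (a - 2) / (a - 1)                                   # P(X = 0) = 1/2
pk = {k: a**(-k) for k in range(1, 200)}                # truncated tail is negligible

total = c + sum(pk.values())                            # ~1
EX    = sum(k * q for k, q in pk.items())               # a/(a-1)^2 = 3/4
P_odd = sum(q for k, q in pk.items() if k % 2 == 1)     # a/(a^2-1) = 3/8

M = lambda t: c + sum(math.exp(t * k) * q for k, q in pk.items())   # valid for t < ln a

print(total, EX, P_odd)
print(M(0.5), (a - 2) / (a - 1) + math.exp(0.5) / (a - math.exp(0.5)))  # should agree
```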

Example 7:
Let λ > 0. Suppose the probability density function of X is f_X(x) = λ e^{-λx},  x > 0.
( Exponential distribution. )

a)  Find the moment-generating function of X.

    M_X(t) = E( e^{tX} ) = ∫_0^∞ e^{tx} f(x) dx = ∫_0^∞ e^{tx} λ e^{-λx} dx
           = λ ∫_0^∞ e^{-( λ - t ) x} dx = λ / ( λ - t ),        t < λ.

b)  Use the moment-generating function of X to find E(X).

    M_X′(t) = λ / ( λ - t )²,    t < λ.        E(X) = M_X′(0) = 1/λ.
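A check of Example 7 (sketch; assumes SciPy; λ = 2 and t = 0.5 are illustrative values with t < λ):

```python
import math
from scipy.integrate import quad

lam, t = 2.0, 0.5
num = quad(lambda x: math.exp(t * x) * lam * math.exp(-lam * x), 0, math.inf)[0]
print(num, lam / (lam - t))                 # both ~1.3333

# E(X) via a small finite difference of M at 0 (illustrative only):
M = lambda s: lam / (lam - s)
h = 1e-5
print((M(h) - M(-h)) / (2 * h), 1 / lam)    # both ~0.5
```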

Example 8:
Let X be a discrete Binomial ( n, p ) random variable. That is, suppose the p.m.f.
of X is

    p_X(k) = C(n, k) p^k ( 1 - p )^{n-k},    k = 0, 1, 2, …, n.

    M_X(t) = ∑_{k=0}^{n} e^{tk} C(n, k) p^k ( 1 - p )^{n-k}
           = ∑_{k=0}^{n} C(n, k) ( p e^t )^k ( 1 - p )^{n-k}
           = [ ( 1 - p ) + p e^t ]^n.
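A quick check of the Binomial m.g.f. (sketch; n = 5, p = 0.3, t = 0.7 are illustrative values):

```python
import math

n, p, t = 5, 0.3, 0.7
lhs = sum(math.exp(t * k) * math.comb(n, k) * p**k * (1 - p)**(n - k)
          for k in range(n + 1))
rhs = ((1 - p) + p * math.exp(t))**n
print(lhs, rhs)      # equal up to rounding
```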

Example 9:
Let X be a discrete Geometric ( p ) random variable. That is, suppose the
probability mass function of X is  p_X(x) = ( 1 - p )^{x-1} p,    x = 1, 2, 3, …

a)  Find the moment-generating function of X.

    M_X(t) = ∑_{k=1}^{∞} e^{tk} ( 1 - p )^{k-1} p = p e^t ∑_{k=1}^{∞} e^{t(k-1)} ( 1 - p )^{k-1}
           = p e^t ∑_{n=0}^{∞} [ ( 1 - p ) e^t ]^n
           = p e^t / ( 1 - ( 1 - p ) e^t ),        t < -ln( 1 - p ).

b)  Use the moment-generating function of X to find E(X).

    M_X′(t) = [ p e^t ( 1 - ( 1 - p ) e^t ) + p e^t ( 1 - p ) e^t ] / ( 1 - ( 1 - p ) e^t )²
            = p e^t / ( 1 - ( 1 - p ) e^t )²,        t < -ln( 1 - p ).

    E(X) = M_X′(0) = p / p² = 1/p.
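A check of the Geometric m.g.f. and mean (sketch; p = 0.4 and t = 0.2 are illustrative, with t < -ln(1 - p) ≈ 0.51):

```python
import math

p, t = 0.4, 0.2
series = sum(math.exp(t * k) * (1 - p)**(k - 1) * p for k in range(1, 400))
closed = p * math.exp(t) / (1 - (1 - p) * math.exp(t))
print(series, closed)                        # agree

M = lambda s: p * math.exp(s) / (1 - (1 - p) * math.exp(s))
h = 1e-5
print((M(h) - M(-h)) / (2 * h), 1 / p)       # E(X) = 1/p = 2.5
```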

Example 10:
Let X be a random variable distributed uniformly over the interval [ a, b ].

    M_X(t) = E( e^{tX} ) = ∫_a^b e^{tx} f(x) dx = ∫_a^b e^{tx} / ( b - a ) dx
           = [ e^{tx} / ( t ( b - a ) ) ]_a^b
           = ( e^{tb} - e^{ta} ) / ( t ( b - a ) ),        t ≠ 0.

    M_X(0) = 1.
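A check of the Uniform[a, b] m.g.f. (sketch; assumes SciPy; a = 1, b = 4, t = 0.3 are illustrative):

```python
import math
from scipy.integrate import quad

a, b, t = 1.0, 4.0, 0.3
num    = quad(lambda x: math.exp(t * x) / (b - a), a, b)[0]
closed = (math.exp(t * b) - math.exp(t * a)) / (t * (b - a))
print(num, closed)        # both ~2.189
```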

Example 11:
Let X be a Poisson ( λ ) random variable. That is,

    P(X = k) = λ^k e^{-λ} / k!,    k = 0, 1, 2, 3, …

a)  Find the moment-generating function of X, M_X(t).

    M_X(t) = ∑_{k=0}^{∞} e^{tk} λ^k e^{-λ} / k! = e^{-λ} ∑_{k=0}^{∞} ( λ e^t )^k / k!
           = e^{-λ} e^{λ e^t} = e^{λ ( e^t - 1 )}.

    ( 1.9.17 in the 7th edition; 1.9.16 in the 6th edition. )

    Let ψ(t) = ln M(t), where M(t) is the m.g.f. of a distribution.
    Prove that ψ′(0) = μ and ψ″(0) = σ². The function ψ(t) is
    called the cumulant generating function.

    ( ln M_X(t) )′ = M_X′(t) / M_X(t),
    ( ln M_X(t) )″ = [ M_X″(t) M_X(t) - ( M_X′(t) )² ] / [ M_X(t) ]².

    Since  M_X(0) = 1,  M_X′(0) = E(X),  M_X″(0) = E(X²),

    ψ′(0) = ( ln M_X(t) )′ |_{t=0} = E(X) = μ_X,
    ψ″(0) = ( ln M_X(t) )″ |_{t=0} = E(X²) - [ E(X) ]² = Var(X) = σ_X².

b)  Find E(X) and Var(X).

    ln M_X(t) = λ ( e^t - 1 ).

    ( ln M_X(t) )′ = λ e^t.        ( ln M_X(t) )′ |_{t=0} = E(X) = λ.
    ( ln M_X(t) )″ = λ e^t.        ( ln M_X(t) )″ |_{t=0} = Var(X) = λ.
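A symbolic check of Example 11 (sketch; assumes SymPy): the Poisson cumulant generating function ψ(t) = ln M_X(t) = λ(e^t - 1) has ψ′(0) = ψ″(0) = λ.

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
M   = sp.exp(lam * (sp.exp(t) - 1))     # Poisson m.g.f.
psi = sp.log(M)                         # cumulant generating function

print(sp.simplify(sp.diff(psi, t).subs(t, 0)))       # lambda  (= E(X))
print(sp.simplify(sp.diff(psi, t, 2).subs(t, 0)))    # lambda  (= Var(X))
```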

Example 12:
Let Y denote a random variable with probability density function given by

    f(y) = (1/2) e^{-|y|},    -∞ < y < ∞.        ( double exponential p.d.f. )

a)  Find the moment-generating function of Y. For which values of t does it exist?

    M_Y(t) = ∫_{-∞}^{∞} e^{ty} (1/2) e^{-|y|} dy
           = ∫_{-∞}^{0} e^{ty} (1/2) e^{y} dy + ∫_{0}^{∞} e^{ty} (1/2) e^{-y} dy
           = (1/2) ∫_{-∞}^{0} e^{y ( t + 1 )} dy + (1/2) ∫_{0}^{∞} e^{y ( t - 1 )} dy.

    Note that the first integral converges only if t + 1 > 0,
    and the second integral converges only if t - 1 < 0.
    Therefore, the moment-generating function is only defined for -1 < t < 1.

    M_Y(t) = (1/2) [ e^{y(t+1)} / ( t + 1 ) ]_{-∞}^{0} + (1/2) [ e^{y(t-1)} / ( t - 1 ) ]_{0}^{∞}
           = 1 / ( 2 ( t + 1 ) ) - 1 / ( 2 ( t - 1 ) )
           = [ ( t - 1 ) - ( t + 1 ) ] / ( 2 ( t + 1 )( t - 1 ) )
           = -2 / ( 2 ( t² - 1 ) )
           = 1 / ( 1 - t² ),        -1 < t < 1.

b)  Find E(Y).

    M_Y′(t) = -( 1 - t² )^{-2} ( -2t ) = 2t ( 1 - t² )^{-2}.        E(Y) = M_Y′(0) = 0.

    OR

    E(Y) = ∫_{-∞}^{∞} y (1/2) e^{-|y|} dy = 0,   since  y (1/2) e^{-|y|}  is an odd function.

c)  Find Var(Y).

    M_Y″(t) = 2 ( 1 - t² )^{-2} + 2t · 2 ( 1 - t² )^{-3} ( 2t ) = ( 2 + 6t² ) / ( 1 - t² )³.

    E(Y²) = M_Y″(0) = 2.        Var(Y) = 2 - 0² = 2.

    OR

    Var(Y) = E(Y²) - [ E(Y) ]² = ∫_{-∞}^{∞} y² (1/2) e^{-|y|} dy = … = 2.

d)  Find the cumulative distribution function F(y) = P(Y ≤ y).

    If y < 0,    F(y) = ∫_{-∞}^{y} (1/2) e^{x} dx = (1/2) e^{y}.

    If y ≥ 0,    F(y) = ∫_{-∞}^{0} (1/2) e^{x} dx + ∫_{0}^{y} (1/2) e^{-x} dx
                      = 1/2 + (1/2)( 1 - e^{-y} ) = 1 - (1/2) e^{-y}.

    Therefore,

        F(y) = (1/2) e^{y}              y < 0
        F(y) = 1 - (1/2) e^{-y}         y ≥ 0
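A numerical check of Example 12 (sketch; assumes SciPy; t = 0.5 is an illustrative value inside -1 < t < 1):

```python
import math
from scipy.integrate import quad

pdf = lambda y: 0.5 * math.exp(-abs(y))
t = 0.5

def integrate_R(g):                     # integral over the whole real line, split at 0
    return quad(g, -math.inf, 0)[0] + quad(g, 0, math.inf)[0]

print(integrate_R(lambda y: math.exp(t * y) * pdf(y)), 1 / (1 - t**2))  # ~1.3333 each
print(integrate_R(lambda y: y * pdf(y)))          # E(Y)   ~ 0
print(integrate_R(lambda y: y * y * pdf(y)))      # E(Y^2) = 2
```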

Example 13:
A simple model for describing mortality in the general population in a particular
country is given by the probability density function

    f(y) = ( 252 / 10^{18} ) y^6 ( 100 - y )²,    0 < y < 100.

a)  Verify that f(y) is a valid probability density function.

    1.  f(y) ≥ 0 for each y;
    2.  ∫ f(y) dy = 1:

    ∫_0^{100} ( 252 / 10^{18} ) y^6 ( 100 - y )² dy = 252 ∫_0^1 x^6 ( 1 - x )² dx        ( substituting x = y/100 )

        = 252 [ (1/7) x^7 - (1/4) x^8 + (1/9) x^9 ]_0^1 = 252 ( 1/7 - 1/4 + 1/9 ) = 252 · (1/252) = 1.

b)  Based on this model, which event is more likely:
    A: a person dies between the ages of 70 and 80,   or
    B: a person lives past age 80?

    A:  ∫_{70}^{80} ( 252 / 10^{18} ) y^6 ( 100 - y )² dy = 252 ∫_{0.7}^{0.8} x^6 ( 1 - x )² dx

            = 252 [ (1/7) x^7 - (1/4) x^8 + (1/9) x^9 ]_{0.7}^{0.8} ≈ 0.7382 - 0.4628 = 0.2754.

    B:  ∫_{80}^{100} ( 252 / 10^{18} ) y^6 ( 100 - y )² dy = 252 ∫_{0.8}^{1.0} x^6 ( 1 - x )² dx

            = 252 [ (1/7) x^7 - (1/4) x^8 + (1/9) x^9 ]_{0.8}^{1.0} ≈ 1 - 0.7382 = 0.2618.

    A is more likely.

c)  Given that a randomly selected individual just celebrated his 60th birthday, find
    the probability that he will live past age 80.

    P( over 80 | over 60 ) = P( over 80 and over 60 ) / P( over 60 )

        = ∫_{80}^{100} ( 252 / 10^{18} ) y^6 ( 100 - y )² dy  /  ∫_{60}^{100} ( 252 / 10^{18} ) y^6 ( 100 - y )² dy

        ≈ ( 1 - 0.7382 ) / ( 1 - 0.2318 ) = 0.2618 / 0.7682 ≈ 0.3408.

d)  Find the value of y that maximizes f(y)  ( the mode ).

    f′(y) = ( 252 / 10^{18} ) [ 6 y^5 ( 100 - y )² - 2 y^6 ( 100 - y ) ]
          = ( 252 / 10^{18} ) y^5 ( 100 - y ) [ 6 ( 100 - y ) - 2y ]
          = ( 252 / 10^{18} ) y^5 ( 100 - y ) [ 600 - 8y ] = 0.

    y = 0,   y = 100 ( not a maximum ),   y = 75 years ( maximum ).

e)  Find the (average) life expectancy.

    E(Y) = ∫_0^{100} y f(y) dy = ∫_0^{100} ( 252 / 10^{18} ) y^7 ( 100 - y )² dy = 252 · 100 ∫_0^1 x^7 ( 1 - x )² dx

         = 252 · 100 [ (1/8) x^8 - (2/9) x^9 + (1/10) x^{10} ]_0^1 = 252 · 100 ( 1/8 - 2/9 + 1/10 ) = 252 · 100 · (1/360) = 70 years.

    OR

    Consider X = Y / 100.  Then Y = 100 X, and X has the probability density function

        f(x) = 252 x^6 ( 1 - x )²,    0 < x < 1.

    Then X has a Beta distribution with α = 7 and β = 3.

        E(X) = 7 / ( 7 + 3 ) = 0.70.        E(Y) = 100 E(X) = 70 years.

f)  Find the standard deviation of the lifetimes.

    Var(X) = α β / [ ( α + β )² ( α + β + 1 ) ] = 7 · 3 / ( 10² · 11 ) = 21/1100.

    Var(Y) = 100² Var(X) = 2100/11.        SD(Y) ≈ 13.817.

    OR

    E(Y²) = ∫_0^{100} y² f(y) dy = ∫_0^{100} ( 252 / 10^{18} ) y^8 ( 100 - y )² dy = 56000/11.

    Var(Y) = E(Y²) - [ E(Y) ]² = 56000/11 - 70² = 2100/11.        SD(Y) = √(2100/11) ≈ 13.817.
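A numerical check of Example 13 (sketch; assumes SciPy):

```python
from scipy.integrate import quad

f = lambda y: 252 / 1e18 * y**6 * (100 - y)**2      # density on 0 < y < 100

total   = quad(f, 0, 100)[0]                        # 1
P_70_80 = quad(f, 70, 80)[0]                        # ~0.2754  (event A)
P_gt_80 = quad(f, 80, 100)[0]                       # ~0.2618  (event B)
P_gt_60 = quad(f, 60, 100)[0]                       # ~0.7682
print(total, P_70_80, P_gt_80, P_gt_80 / P_gt_60)   # last value ~0.3408

EY  = quad(lambda y: y * f(y), 0, 100)[0]           # 70
EY2 = quad(lambda y: y**2 * f(y), 0, 100)[0]        # 56000/11
print(EY, (EY2 - EY**2) ** 0.5)                     # 70, ~13.817
```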
