Unit02 Slide
STA3154
Random Variables of the Discrete Type
What we are going to learn in this chapter
Example 2.1.1
Example 2.1.2
Key questions
▶ Challenges
  1. In many practical situations, the probabilities assigned to the events are unknown.
  2. Since there are many ways of defining a function X on S, which function do we want to use?
▶ Solutions
  1. We often need to estimate these probabilities or percentages through repeated observations, or decide how to "formulate/mathematize" the outcome.
     ▶ e.g., How do we define a performance metric? What counts as success for a student?
  2. Try to determine what measurement (or measurements) should be taken on an outcome.
     ▶ e.g., What percentage of newborn girls in NYC weigh less than 7 pounds?
Discrete random variable
Definition 2.1.2
(a) f(x) > 0, x ∈ S;
(b) ∑_{x∈S} f(x) = 1;
(c) P(X ∈ A) = ∑_{x∈A} f(x), where A ⊂ S.
We let f(x) = 0 for x ∉ S. We sometimes call S the support of X.
Cumulative distribution function
Uniform distribution
▶ When a pmf is constant on the space or support, we say that the distribution is uniform over that space.
▶ In Example 2.1.2, X has a discrete uniform distribution on S = {1, 2, 3, 4, 5, 6} and its pmf is
  f(x) = 1/6, x = 1, 2, 3, 4, 5, 6.
▶ What does F(x) look like?
[Figure: the cdf F(x) is a step function rising from 0 to 1, jumping by 1/6 at each of x = 1, …, 6]
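As a small computational companion (not part of the original slides), here is a minimal Python sketch of this pmf and its step-function cdf, using exact fractions so the jumps of 1/6 are visible without rounding:

```python
from fractions import Fraction

# pmf of a discrete uniform random variable on S = {1, ..., 6}
def f(x):
    return Fraction(1, 6) if x in range(1, 7) else Fraction(0)

# cdf F(x) = P(X <= x): a step function that jumps by 1/6 at each support point
def F(x):
    return sum(f(k) for k in range(1, int(x) + 1))

print(F(3))   # 1/2
print(F(6))   # 1
```

Evaluating F at non-integer points (e.g. F(3.5) == F(3)) shows the "flat steps" between jumps.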
Example 2.1.3
Roll a fair four-sided die twice, and let X be the maximum of the two outcomes.
▶ The outcome space for this experiment is S = {(d1, d2) : d1 = 1, 2, 3, 4; d2 = 1, 2, 3, 4}.
▶ Assume that each of these 16 points has probability 1/16.
▶ P(X = 1) = 1/16
▶ P(X = 2) = 3/16
▶ P(X = 3) = 5/16
[Figure: the 16 lattice points (d1, d2) with d1, d2 ∈ {1, 2, 3, 4}]
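The pmf of X can be found by brute-force enumeration of the 16 equally likely outcomes; a short sketch (not from the slides) that tabulates P(X = k) exactly:

```python
from fractions import Fraction
from itertools import product

# Enumerate the 16 equally likely outcomes (d1, d2) and tabulate X = max(d1, d2)
pmf = {}
for d1, d2 in product(range(1, 5), repeat=2):
    x = max(d1, d2)
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, 16)

# P(X = k) = (2k - 1)/16 for k = 1, 2, 3, 4
for k in sorted(pmf):
    print(k, pmf[k])
```

The pattern (2k − 1)/16 arises because X = k exactly on the "L-shaped" band of outcomes where the larger coordinate equals k.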
Mathematical expectation
Example 2.2.1
Consider a game in which the participant casts a fair die and then receives a payment according to the following schedule:
▶ If the event A = {1, 2, 3} occurs, s/he receives one dollar.
▶ If B = {4, 5} occurs, s/he receives two dollars.
▶ If C = {6} occurs, s/he receives three dollars.
If X is a random variable that represents the payoff, then the pmf of X is given by
  f(x) = (4 − x)/6, x = 1, 2, 3.
The average payment would be
  (1)(3/6) + (2)(2/6) + (3)(1/6) = 5/3.
[Figure: the pmf f(x) = (4 − x)/6 plotted at x = 1, 2, 3]
Mathematical expectation
The mathematical expectation (mean) of X is
  E(X) = ∑_{x∈S} x f(x)
Another function of X
Let Y = X², with pmf g(y) = (4 − √y)/6 for y = 1, 4, 9. Then
  E(Y) = ∑_{y=1,4,9} y g(y) = (1)(3/6) + (4)(2/6) + (9)(1/6) = 10/3.
Comparison
E(X) = (1)(3/6) + (2)(2/6) + (3)(1/6) = 5/3
E(Y) = E(X²) = (1)(3/6) + (4)(2/6) + (9)(1/6) = 10/3
Note that E(X²) ≠ [E(X)]² = 25/9.
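The two expectations above can be checked by direct computation over the pmf; a minimal sketch (not from the slides), using exact fractions:

```python
from fractions import Fraction

# pmf of the payoff X: f(x) = (4 - x)/6 for x = 1, 2, 3
f = {x: Fraction(4 - x, 6) for x in (1, 2, 3)}

EX  = sum(x * p for x, p in f.items())      # E(X)   = 5/3
EX2 = sum(x**2 * p for x, p in f.items())   # E(X^2) = E(Y) = 10/3

print(EX, EX2)        # 5/3 10/3
print(EX2 == EX**2)   # False: E(X^2) != [E(X)]^2
```

Note that E(X²) is computed by summing x²f(x) over the pmf of X; there is no need to derive the pmf g(y) of Y = X² first.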
Definition 2.2.1
Example 2.2.2
Theorem 2.2.1
If c is a constant and u is a function, then
  E[cu(X)] = cE[u(X)].
c. If c1 and c2 are constants and u1 and u2 are functions, then
  E[c1 u1(X) + c2 u2(X)] = c1 E[u1(X)] + c2 E[u2(X)].
Example 2.2.4
Special Mathematical Expectations
Variance
Example 2.2.1 revisited
Alternative formula for variance
Var(X) = E[(X − µ)²] = E(X²) − µ² = E(X²) − [E(X)]²
Example 2.3.1
Let X equal the number of spots on the side facing upward after a
fair six-sided die is rolled.
  f(x) = P(X = x) = 1/6, x = 1, . . . , 6.

  E(X) = µ = ∑_{x=1}^{6} x f(x) = (1 + 2 + · · · + 6)/6 = 21/6

  E(X²) = ∑_{x=1}^{6} x² f(x) = (1² + 2² + · · · + 6²)/6 = 91/6

  Var(X) = E[X²] − µ² = 91/6 − (21/6)² = 35/12
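These values can be verified exactly with rational arithmetic; a short sketch (not from the slides):

```python
from fractions import Fraction

# fair six-sided die: f(x) = 1/6 on {1, ..., 6}
f = {x: Fraction(1, 6) for x in range(1, 7)}

mu  = sum(x * p for x, p in f.items())      # E(X)   = 21/6 = 7/2
EX2 = sum(x**2 * p for x, p in f.items())   # E(X^2) = 91/6
var = EX2 - mu**2                           # Var(X) = 35/12

print(mu, EX2, var)   # 7/2 91/6 35/12
```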
Some properties of expectation and variance
Let X be a random variable with mean µX and variance σX², and define Y = aX + b, where a and b are constants.
▶ Y is a random variable.
▶ The mean of Y is
  E(Y) = µY = E(aX + b) = aE(X) + b = aµX + b.
▶ The variance of Y is
  Var(Y) = Var(aX + b) = E[(aX + b − µY)²]
         = E[(aX + b − (aµX + b))²]
         = E[(aX − aµX)²]
         = a² E[(X − µX)²]
         = a² σX².
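The identities E(aX + b) = aµX + b and Var(aX + b) = a²σX² can be confirmed on any concrete pmf; a minimal sketch (not from the slides), reusing the fair die as the example distribution:

```python
from fractions import Fraction

# any discrete pmf works; use the fair die from Example 2.3.1
f = {x: Fraction(1, 6) for x in range(1, 7)}

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((x - m)**2 * p for x, p in pmf.items())

a, b = 3, 10
g = {a * x + b: p for x, p in f.items()}   # pmf of Y = aX + b

print(mean(g) == a * mean(f) + b)   # True
print(var(g) == a**2 * var(f))      # True
```

The shift b moves the mean but cancels out of every deviation X − µX, which is why it never appears in the variance.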
Example 2.3.3
Let X have a uniform distribution on the first m positive integers.
  f(x) = P(X = x) = 1/m, x = 1, . . . , m.

  E(X) = µ = ∑_{x=1}^{m} x f(x) = (1 + 2 + · · · + m)/m = (m + 1)/2

  E(X²) = ∑_{x=1}^{m} x² f(x) = (1² + 2² + · · · + m²)/m = (m + 1)(2m + 1)/6

  Var(X) = E[X²] − µ² = (m + 1)(2m + 1)/6 − ((m + 1)/2)² = (m² − 1)/12
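The closed forms (m + 1)/2 and (m² − 1)/12 can be checked against direct summation for several values of m; a short sketch (not from the slides):

```python
from fractions import Fraction

def uniform_moments(m):
    """Mean and variance of the discrete uniform distribution on {1, ..., m}."""
    f = {x: Fraction(1, m) for x in range(1, m + 1)}
    mu = sum(x * p for x, p in f.items())
    var = sum(x**2 * p for x, p in f.items()) - mu**2
    return mu, var

for m in (6, 10, 100):
    mu, var = uniform_moments(m)
    print(m, mu == Fraction(m + 1, 2), var == Fraction(m**2 - 1, 12))
```

For m = 6 this reduces to Example 2.3.1: µ = 7/2 and σ² = 35/12.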
Definition 2.3.1
If there is a positive number h such that E(e^{tX}) exists and is finite for −h < t < h, then the function defined by
  M(t) = E(e^{tX})
is called the moment-generating function (mgf) of X.
Moment-generating function
▶ Two random variables have the same moment-generating function if and only if they have the same probability distribution.
▶ Let M^(m)(t) denote the m-th derivative of M(t) with respect to t. Then M^(m)(0) = E(X^m):
  M(t) = ∑_{x∈S} e^{tx} f(x)
  M′(t) = ∑_{x∈S} x e^{tx} f(x)
  M″(t) = ∑_{x∈S} x² e^{tx} f(x)
Setting t = 0,
  M′(0) = ∑_{x∈S} x f(x) = E(X)
  M″(0) = ∑_{x∈S} x² f(x) = E(X²)
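The fact that M′(0) = E(X) and M″(0) = E(X²) can be illustrated numerically: evaluate M(t) from its series definition and approximate the derivatives at 0 by finite differences. A sketch (not from the slides), using the fair die:

```python
import math

f = {x: 1 / 6 for x in range(1, 7)}   # fair die

def M(t):
    # mgf from the definition: sum of e^{tx} f(x) over the support
    return sum(math.exp(t * x) * p for x, p in f.items())

h = 1e-4
Mp0  = (M(h) - M(-h)) / (2 * h)            # central difference ~ M'(0)
Mpp0 = (M(h) - 2 * M(0) + M(-h)) / h**2    # second difference ~ M''(0)

print(round(Mp0, 4))    # ~ 3.5     = E(X)
print(round(Mpp0, 3))   # ~ 15.167  = 91/6 = E(X^2)
```

The finite-difference step h trades truncation error against floating-point cancellation; h around 1e-4 is a reasonable compromise here.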
Example 2.3.7
Suppose X has the pmf
  f(x) = q^{x−1} p, x = 1, 2, 3, . . . ; p + q = 1.
▶ The mgf of X is
  M(t) = ∑_{x∈S} e^{tx} f(x) = ∑_{x=1}^{∞} e^{tx} q^{x−1} p
       = (p/q) ∑_{x=1}^{∞} (qe^t)^x
       = (p/q) [(qe^t) + (qe^t)² + · · ·]
       = (p/q) · qe^t / (1 − qe^t)
       = pe^t / (1 − qe^t),    provided qe^t < 1, i.e., t < −ln q.
▶ Differentiating,
  M′(t) = [(1 − qe^t)(pe^t) − pe^t(−qe^t)] / (1 − qe^t)²
        = pe^t / (1 − qe^t)²
  M″(t) = [(1 − qe^t)²(pe^t) − (pe^t)(2)(1 − qe^t)(−qe^t)] / (1 − qe^t)⁴
        = pe^t(1 + qe^t) / (1 − qe^t)³
▶ Hence,
  E(X) = M′(0) = 1/p,    E(X²) = M″(0) = (1 + q)/p².
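The closed-form mgf and the moments it produces can be sanity-checked numerically; a sketch (not from the slides), with p = 0.25 chosen arbitrarily:

```python
import math

p = 0.25
q = 1 - p

def M(t):
    # closed form M(t) = p e^t / (1 - q e^t), valid for t < -ln(q)
    return p * math.exp(t) / (1 - q * math.exp(t))

# compare the series definition with the closed form at one admissible t
t = 0.1   # t < -ln(0.75) ~ 0.288, so the series converges
series = sum(math.exp(t * x) * q**(x - 1) * p for x in range(1, 500))
print(abs(series - M(t)) < 1e-9)   # True

# E(X) = 1/p = 4 and E(X^2) = (1 + q)/p^2 = 28 via finite differences at t = 0
h = 1e-4
EX  = (M(h) - M(-h)) / (2 * h)
EX2 = (M(h) - 2 * M(0) + M(-h)) / h**2
print(round(EX, 3), round(EX2, 2))   # 4.0 28.0
```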
Introduction
Bernoulli distribution
f(x) = p^x (1 − p)^{1−x}, x = 0, 1,
Properties of Bernoulli distribution
▶ The variance of X is
  Var(X) = ∑_{x=0}^{1} x² p^x (1 − p)^{1−x} − [E(X)]² = p − p² = p(1 − p).
Example 2.4.4
Binomial distribution
▶ The number of ways of selecting x positions for the x successes in the n trials is
  (n choose x) = n! / (x!(n − x)!),
and since the probabilities of success and failure on each trial are, respectively, p and q = 1 − p, the probability of each of these ways is p^x(1 − p)^{n−x}.
▶ f(x), the pmf of X, is the sum of the probabilities of the (n choose x) mutually exclusive events:
  f(x) = (n choose x) p^x (1 − p)^{n−x}, x = 0, . . . , n.
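A minimal sketch of this pmf (not from the slides), using `math.comb` for the binomial coefficient and checking that the probabilities sum to 1:

```python
from math import comb

def binom_pmf(x, n, p):
    # (n choose x) p^x (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.3   # illustrative parameters
probs = [binom_pmf(x, n, p) for x in range(n + 1)]

print(abs(sum(probs) - 1) < 1e-12)   # True: the pmf sums to 1
print(round(binom_pmf(3, n, p), 4))  # P(X = 3)
```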
Summary of binomial experiment
Example 2.4.5
Example 2.4.8
Leghorn chickens are raised for laying eggs. Let p = 0.5 be the
probability that a newly hatched chick is a female. Assuming
independence, let X equal the number of female chicks out of 10
newly hatched chicks selected at random. Then the distribution of
X is b(10, 0.5). The probability of 5 or fewer female chicks is
  P(X ≤ 5) = ∑_{x=0}^{5} (10 choose x)(0.5)^{10} = 638/1024 ≈ 0.6230.
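This cumulative probability is a short sum of binomial coefficients; a quick check (not from the slides):

```python
from math import comb

# X ~ b(10, 0.5): every outcome sequence has probability (0.5)^10 = 1/1024,
# so P(X <= 5) is just the count of favorable sequences over 2^10
prob = sum(comb(10, x) for x in range(6)) / 2**10

print(prob)   # 0.623046875  (= 638/1024)
```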
Binomial expansion revisited
▶ If n is a positive integer,
  (a + b)^n = ∑_{x=0}^{n} (n choose x) a^x b^{n−x}.
mgf of binomial distribution
M(t) = E(e^{tX}) = ∑_{x=0}^{n} e^{tx} (n choose x) p^x (1 − p)^{n−x} = (1 − p + pe^t)^n,
by the binomial expansion with a = pe^t and b = 1 − p.
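The closed form (1 − p + pe^t)^n can be confirmed against the series definition of the mgf; a sketch (not from the slides), with arbitrary illustrative parameters:

```python
from math import comb, exp

n, p = 10, 0.3
q = 1 - p

def M_series(t):
    # E(e^{tX}) computed directly from the binomial pmf
    return sum(exp(t * x) * comb(n, x) * p**x * q**(n - x) for x in range(n + 1))

def M_closed(t):
    # closed form from the binomial expansion
    return (q + p * exp(t))**n

for t in (-1.0, 0.0, 0.5):
    print(abs(M_series(t) - M_closed(t)) < 1e-9 * M_closed(t))   # True
```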
The Negative Binomial Distribution
Introduction
Example 2.5.1
P(X ≥ 4) =
P(X ≤ 4) =
The Poisson Distribution
Introduction
Definition
f(x) = λ^x e^{−λ} / x!, x = 0, 1, 2, . . . ,
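A minimal sketch of the Poisson pmf (not from the slides), checking that it sums to 1 and that its mean equals λ (truncating the infinite sum where the terms are negligible):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    # lambda^x e^{-lambda} / x!
    return lam**x * exp(-lam) / factorial(x)

lam = 3.0   # illustrative rate
total = sum(poisson_pmf(x, lam) for x in range(100))
mean = sum(x * poisson_pmf(x, lam) for x in range(100))

print(abs(total - 1) < 1e-12)   # True: the pmf sums to 1
print(round(mean, 6))           # 3.0: the mean of a Poisson is lambda
```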
Example 2.6.4
P(X ≥ 5) =