
Selected Probability Models

Bernoulli trials - most common type of situation modeled:
independently repeated, two possible outcomes (S or F),
probability of success p is the same from trial to trial.

Bernoulli distribution: P(X = 1) = p = 1 - P(X = 0).
E(X) = p, Var(X) = p(1 - p),
m_X(t) = E(e^{tX}) = e^{0·t}(1 - p) + e^{1·t} p = 1 - p + p e^t.
Binomial distribution, BIN(n, p) - n Bernoulli trials:
S_n = X_1 + X_2 + ... + X_n, where each X_i is a Bernoulli trial, i = 1, 2, ..., n.
S_n is the number of successes in n Bernoulli trials.

E(S_n) = Σ E(X_i) = np,
Var(S_n) = Var(X_1 + ... + X_n) = Σ Var(X_i) = np(1 - p).

P(S_n = k) = C(n, k) p^k (1 - p)^{n-k}, k = 0, 1, ..., n,
where C(n, k) denotes the binomial coefficient.

m_{S_n}(t) = Σ_{k=0}^{n} e^{tk} C(n, k) p^k (1 - p)^{n-k}
= Σ_{k=0}^{n} C(n, k) (p e^t)^k (1 - p)^{n-k} = (p e^t + 1 - p)^n.
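As a quick numerical check of these formulas, here is a minimal Python sketch (the values n = 10 and p = 0.3 are chosen purely for illustration):

    import math

    n, p = 10, 0.3                      # illustrative values
    # pmf of BIN(n, p): P(S_n = k) = C(n, k) p^k (1 - p)^(n - k)
    pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

    mean = sum(k * pk for k, pk in enumerate(pmf))
    var = sum(k**2 * pk for k, pk in enumerate(pmf)) - mean**2

    print(sum(pmf))                     # ~ 1.0, the probabilities sum to one
    print(mean, n * p)                  # both ~ 3.0, E(S_n) = np
    print(var, n * p * (1 - p))         # both ~ 2.1, Var(S_n) = np(1 - p)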

Geometric distribution, GEO(p) - X is the number of failures
before the first success, X = 0, 1, 2, ....
P(X = x) = p(1 - p)^x = p q^x, where q = 1 - p.

E(X) = Σ_{x=0}^{∞} x p(1 - p)^x = p Σ_{x=1}^{∞} x q^x.

Σ_{x=1}^{∞} x q^x = q + 2q^2 + 3q^3 + 4q^4 + ...
= (q + q^2 + q^3 + q^4 + ...)
+ (q^2 + q^3 + q^4 + ...)
+ (q^3 + q^4 + ...)
+ (q^4 + ...)
+ ...
= q/(1 - q) + q^2/(1 - q) + q^3/(1 - q) + ...
= (q + q^2 + q^3 + ...)/(1 - q) = q/(1 - q)^2 = q/p^2.

Finally, E(X) = p · q/p^2 = q/p.
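The value of the series can be verified numerically; a small sketch that truncates the infinite sum (p = 0.25 is an arbitrary example):

    p = 0.25                             # illustrative value
    q = 1 - p

    # truncated E(X) = sum_{x >= 0} x * p * q^x
    approx = sum(x * p * q**x for x in range(1000))
    print(approx, q / p)                 # both ~ 3.0, i.e. E(X) = q/p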


Var(X) = E(X^2) - (E(X))^2, E(X^2) = Σ_{x=1}^{∞} p(1 - p)^x x^2 = ?
Rather than summing this series directly, use the moment generating function.

m_X(t) = E(e^{tX}) = e^{0·t} p + e^{t} pq + e^{2t} pq^2 + e^{3t} pq^3 + ...
= p(1 + e^t q + (e^t q)^2 + ...) = p · 1/(1 - qe^t) = p(1 - qe^t)^{-1}.

m'(t) = (-1) p (1 - qe^t)^{-2} (-qe^t) = pq e^t (1 - qe^t)^{-2},

m_1 = m'(0) = pq(1 - q)^{-2} = pq/p^2 = q/p.

m''(t) = d/dt [pq e^t (1 - qe^t)^{-2}]
= pq [e^t (1 - qe^t)^{-2} + e^t (-2)(1 - qe^t)^{-3}(-qe^t)]
= pq e^t (1 - qe^t)^{-3} [1 - qe^t + 2qe^t]
= pq e^t (1 + qe^t)(1 - qe^t)^{-3}.

m_2 = m''(0) = pq(1 + q)(1 - q)^{-3} = pq(1 + q)/p^3 = q(1 + q)/p^2.

Now Var(X) = m_2 - m_1^2 = q(1 + q)/p^2 - (q/p)^2 = (q + q^2 - q^2)/p^2 = q/p^2.
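The same kind of truncated sums confirm m_1 = q/p and Var(X) = q/p^2 (sketch, with an arbitrary p):

    p = 0.4                              # illustrative value
    q = 1 - p

    m1 = sum(x * p * q**x for x in range(2000))        # E(X)
    m2 = sum(x**2 * p * q**x for x in range(2000))     # E(X^2)

    print(m1, q / p)                     # both ~ 1.5
    print(m2 - m1**2, q / p**2)          # both ~ 3.75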


S(x) = P(X > x) = P(X ≥ x + 1) = P(X = x + 1) + P(X = x + 2) + ...
= pq^{x+1} + pq^{x+2} + pq^{x+3} + ...
= pq^{x+1}(1 + q + q^2 + ...) = pq^{x+1} · 1/(1 - q) = pq^{x+1}/p = q^{x+1}.

Therefore F(x) = 1 - S(x) = 1 - q^{x+1}.
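The closed form of the cdf can be checked against a partial sum of the pmf (sketch; p = 1/6 and x = 4 are illustrative):

    p, x = 1/6, 4                        # illustrative values
    q = 1 - p

    cdf_sum = sum(p * q**k for k in range(x + 1))      # P(X <= x) term by term
    cdf_closed = 1 - q**(x + 1)                        # F(x) = 1 - q^(x+1)
    print(cdf_sum, cdf_closed)           # both ~ 0.598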


Problem 9.1.9, p. 269. X is the number of failures, U is the number of
trials including the first success, so X + 1 = U.
P(S) = (1/6)^6 · 6 = 1/6^5 = p and q = 1 - p = 1 - 1/6^5.

E(U) = 1 + E(X) = 1 + q/p = 1 + 6^5 (1 - 1/6^5) = 1 + 6^5 - 1 = 6^5.

Var(U) = Var(X + 1) = Var(X) = q/p^2 = (6^5)^2 (1 - 1/6^5) = 6^5 (6^5 - 1).
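A quick check of the arithmetic (using p = 1/6^5 as above):

    p = 1 / 6**5
    q = 1 - p

    print(1 + q / p, 6**5)                     # both ~ 7776, E(U)
    print(q / p**2, 6**5 * (6**5 - 1))         # both ~ 60458400, Var(U)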

HW 9.1.12, p. 270


Negative Binomial distribution - a generalization of the geometric
distribution: waiting for r successes.

S F F S F F S ... S S S F S | S
(n trials in total; the last (nth) result is a success,
and the first n - 1 trials contain r - 1 successes)

Let k be the number of failures, k = 0, 1, ....

P(X = k; p, r) = C(k + r - 1, r - 1) p^{r-1} q^k · p = C(k + r - 1, r - 1) p^r q^k.

X = X_1 + X_2 + ... + X_r, where X_i ~ GEO(p).
E(X) = E(X_1 + ... + X_r) = r E(X_i) = rq/p.
The X_i's are independent,
so that Var(X) = Var(X_1 + ... + X_r) = r Var(X_i) = rq/p^2.
Also, m_X(t) = [m_{X_i}(t)]^r = p^r (1 - qe^t)^{-r}.
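A short sketch checking the pmf, mean, and variance formulas (r = 3 and p = 1/6 are illustrative and happen to match the example that follows):

    import math

    r, p = 3, 1/6                        # illustrative values
    q = 1 - p

    def nb_pmf(k):
        # P(X = k) = C(k + r - 1, r - 1) p^r q^k
        return math.comb(k + r - 1, r - 1) * p**r * q**k

    mean = sum(k * nb_pmf(k) for k in range(500))
    var = sum(k**2 * nb_pmf(k) for k in range(500)) - mean**2

    print(mean, r * q / p)               # both ~ 15,  rq/p
    print(var, r * q / p**2)             # both ~ 90,  rq/p^2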


Example

Suppose that an ordinary die is rolled repeatedly.

(i) Find the probability that the third 6 occurs on the 7th roll.
(ii) Find the probability that the number of rolls until the first 6
occurs is at most nine.
(i) r = 3, p = 1/6, X ~ NB(3, 1/6), X is the number of failures.
P(X + 3 = 7) = P(X = 4) = C(6, 2) (1/6)^3 (5/6)^4 = 15 · 5^4 / 6^7 = 0.0335.

(ii) X ~ GEO(p), X is the number of failures.
P(X + 1 ≤ 9) = P(X ≤ 8) = F_X(8) = 1 - q^9 = 1 - (5/6)^9 = 0.806.
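Both answers can be reproduced directly from the formulas (sketch):

    import math

    p, q = 1/6, 5/6

    # (i) third 6 on the 7th roll: NB(3, 1/6), P(X = 4)
    print(math.comb(6, 2) * p**3 * q**4)     # ~ 0.0335

    # (ii) first 6 within nine rolls: F_X(8) = 1 - q^9
    print(1 - q**9)                          # ~ 0.806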


Hypergeometric distribution - N is the population size (a
successes and N - a failures). A sample of size n is selected
and X is the number of successes in the sample.

Range of X: X ≤ a and X ≤ n, so that X ≤ min(a, n).
Also, n - X ≤ N - a, so that X ≥ n + a - N, and X ≥ 0;
therefore X ≥ max(0, n + a - N).
Finally, max(0, n - (N - a)) ≤ X ≤ min(a, n).

P(X = k) = C(a, k) C(N - a, n - k) / C(N, n).

E(X) = an/N,   Var(X) = n (a/N)(1 - a/N) · (N - n)/(N - 1),
where (N - n)/(N - 1) is the fpc - the finite population correction factor.
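A minimal sketch of the range, pmf, and moment formulas (the values N = 20, a = 8, n = 5 are illustrative only):

    import math

    N, a, n = 20, 8, 5                   # illustrative values

    lo, hi = max(0, n - (N - a)), min(a, n)
    pmf = {k: math.comb(a, k) * math.comb(N - a, n - k) / math.comb(N, n)
           for k in range(lo, hi + 1)}

    mean = sum(k * pk for k, pk in pmf.items())
    var = sum(k**2 * pk for k, pk in pmf.items()) - mean**2

    print(sum(pmf.values()))                                 # ~ 1.0
    print(mean, a * n / N)                                   # both 2.0
    print(var, n * (a/N) * (1 - a/N) * (N - n) / (N - 1))    # both ~ 0.947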


Example. Nine chips are in a bin - five red and four blue. Six
chips are to be selected at random.
Obtain the distribution of X, the number of red chips selected.
N = 5 + 4 = 9, a = 5, N - a = 4, n = 6,
max(0, n - (N - a)) ≤ X ≤ min(a, n)
max(0, 6 - (9 - 5)) ≤ X ≤ min(5, 6)
2 ≤ X ≤ 5.

P(X = 2) = C(5, 2) C(4, 4) / C(9, 6) = 10/84 = 5/42,
P(X = 3) = C(5, 3) C(4, 3) / C(9, 6) = 40/84 = 20/42,
P(X = 4) = C(5, 4) C(4, 2) / C(9, 6) = 30/84 = 15/42,
P(X = 5) = C(5, 5) C(4, 1) / C(9, 6) = 4/84 = 2/42.

x |   2      3      4      5
p | 5/42  20/42  15/42   2/42

E(X) = 2 · 5/42 + 3 · 20/42 + 4 · 15/42 + 5 · 2/42 = 140/42 = 10/3,
or E(X) = na/N = 6 · 5/9 = 10/3.
Var(X) = n (a/N)(1 - a/N)(N - n)/(N - 1) = 6 · (5/9)(1 - 5/9)((9 - 6)/(9 - 1)) = 5/9.
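The table and both moments can be reproduced exactly with fractions (sketch):

    import math
    from fractions import Fraction

    N, a, n = 9, 5, 6

    pmf = {k: Fraction(math.comb(a, k) * math.comb(N - a, n - k), math.comb(N, n))
           for k in range(max(0, n - (N - a)), min(a, n) + 1)}

    for k, pk in pmf.items():
        print(k, pk)                     # 2: 5/42, 3: 10/21, 4: 5/14, 5: 1/21

    mean = sum(k * pk for k, pk in pmf.items())
    print(mean)                                                # 10/3
    print(sum(k**2 * pk for k, pk in pmf.items()) - mean**2)   # 5/9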

Poisson distribution - a random variable X has a Poisson
distribution, POI(λ), if P(X = n) = (λ^n / n!) e^{-λ}, λ > 0, n = 0, 1, 2, ....

m_X(t) = E(e^{tX}) = Σ_{k=0}^{∞} e^{tk} (λ^k / k!) e^{-λ}
= e^{-λ} Σ_{k=0}^{∞} (λ e^t)^k / k! = e^{-λ} e^{λ e^t} = e^{λ(e^t - 1)}.

m'_X(t) = e^{λ(e^t - 1)} · λ e^t = λ e^{t + λ(e^t - 1)},  m_1 = m'(0) = λ e^0 e^0 = λ.

m''_X(t) = λ [e^{t + λ(e^t - 1)}]' = λ e^{t + λ(e^t - 1)} (t + λ(e^t - 1))'
= λ e^{t + λ(e^t - 1)} (1 + λ e^t).

m_2 = m''(0) = λ e^{0 + λ(e^0 - 1)} (1 + λ e^0) = λ(1 + λ).

Var(X) = m_2 - m_1^2 = λ(1 + λ) - λ^2 = λ.
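A numerical check that the mean and the variance both equal λ (sketch; λ = 2.5 is arbitrary):

    import math

    lam = 2.5                            # illustrative value

    pmf = [lam**n / math.factorial(n) * math.exp(-lam) for n in range(60)]

    mean = sum(n * pn for n, pn in enumerate(pmf))
    var = sum(n**2 * pn for n, pn in enumerate(pmf)) - mean**2

    print(mean, var, lam)                # all ~ 2.5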


A characteristic feature of the Poisson distribution is that its
mean and variance are equal.
E(X) = λ - interpretation of the parameter: λ is the average count
per unit (intensity).

Theorem. If random variables X ~ POI(λ_1) and Y ~ POI(λ_2)
are independent, then the variable X + Y has a POI(λ_1 + λ_2)
distribution.

Proof: m_X(t) = e^{λ_1(e^t - 1)}, m_Y(t) = e^{λ_2(e^t - 1)}, and
m_{X+Y}(t) = m_X(t) m_Y(t) = e^{λ_1(e^t - 1)} e^{λ_2(e^t - 1)} = e^{(λ_1 + λ_2)(e^t - 1)}.
m_{X+Y}(t) is the moment generating function of the POI(λ_1 + λ_2)
distribution.
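The theorem can be illustrated by convolving the two pmfs directly (sketch; λ_1 = 1.2, λ_2 = 0.7, and k = 3 are arbitrary):

    import math

    def poi_pmf(k, lam):
        return lam**k / math.factorial(k) * math.exp(-lam)

    lam1, lam2, k = 1.2, 0.7, 3          # illustrative values

    # P(X + Y = k) by convolution of the two independent pmfs
    conv = sum(poi_pmf(j, lam1) * poi_pmf(k - j, lam2) for j in range(k + 1))
    print(conv, poi_pmf(k, lam1 + lam2)) # both ~ 0.171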


Problem 9.3.3, p. 288. λ = 1.5 (accidents per month).

(i) Let X_5 be the total number of accidents in five consecutive
months, X_5 ~ POI(5 · 1.5 = 7.5).
P(X_5 = 5) = e^{-7.5} (7.5)^5 / 5! = 0.1094.

(ii) Let X_1 be the total number of accidents in one month;
[P(X_1 = 1)]^5 = (e^{-1.5} · 1.5 / 1!)^5 = 0.0042
(the probability of exactly one accident in each of the five months).
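Reproducing both numbers (sketch):

    import math

    lam = 1.5

    # (i) five months, X_5 ~ POI(7.5)
    print(math.exp(-7.5) * 7.5**5 / math.factorial(5))   # ~ 0.1094

    # (ii) exactly one accident in each of the five months
    print((math.exp(-lam) * lam / math.factorial(1))**5) # ~ 0.0042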


Poisson approximation to the binomial distribution

Theorem. If p → 0 and n → ∞ in such a way that
lim_{n→∞} np = λ > 0, then
lim_{n→∞} C(n, k) p^k (1 - p)^{n-k} = (λ^k / k!) e^{-λ},
i.e. BIN(n, p) probabilities are approximated by POI(λ = np) probabilities,
for k = 0, 1, ..., with n large and p small.
If p ≈ 1, successes and failures need to be relabeled.

Problem 9.3.2, p. 288. n = 500, λ = 0.3 (per page), Y is the
number of misprints per page,
S - a page contains at least three misprints. Y ~ POI(0.3).
P(S) = 1 - P(Y < 3) = 1 - Σ_{j=0}^{2} (0.3^j / j!) e^{-0.3}
= 1 - e^{-0.3} (0.3^0/0! + 0.3^1/1! + 0.3^2/2!) = 1 - e^{-0.3}(1 + 0.3 + 0.045) = 0.0036.

X_500 ~ BIN(500, 0.0036), λ = 500 · 0.0036 = 1.8, V ~ POI(1.8).
P(X_500 > 1) = 1 - P(X_500 ≤ 1) ≈ 1 - [P(V = 0) + P(V = 1)]
= 1 - e^{-1.8}(1 + 1.8) = 1 - 2.8 e^{-1.8} = 0.5371.
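A sketch comparing the Poisson approximation used above with the exact binomial probability:

    import math

    # P(S): a page has at least three misprints, Y ~ POI(0.3)
    p_S = 1 - math.exp(-0.3) * (1 + 0.3 + 0.3**2 / 2)
    print(p_S)                                                  # ~ 0.0036

    n = 500
    lam = n * p_S                                               # ~ 1.8
    exact = 1 - ((1 - p_S)**n + n * p_S * (1 - p_S)**(n - 1))   # BIN(n, p_S), P(X > 1)
    approx = 1 - math.exp(-lam) * (1 + lam)                     # POI(lam) approximation
    print(exact, approx)                                        # both ~ 0.537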
