Probability
Joint Probability

Example
In a coin-toss experiment, what is the probability of heads appearing on two successive tosses?

Answer
Since the tosses are statistically independent,
P(H1H2) = P(H1) × P(H2) = 0.5 × 0.5 = 0.25

[Probability tree: the first toss branches into H and T, each with probability 0.5; each branch splits again on the second toss, so the four outcomes H1H2, H1T2, T1H2, T1T2 each have probability 0.25.]
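As an aside added here (not part of the original slides), the multiplication rule for independent events can be checked with a short Python sketch; the variable names and the simulation size are assumptions made for illustration.

```python
import random

# Probability of heads on a single toss of a fair coin
p_heads = 0.5

# Multiplication rule for independent events: P(H1 H2) = P(H1) * P(H2)
p_two_heads = p_heads * p_heads
print(p_two_heads)  # 0.25

# Optional check by simulation: estimate P(H1 H2) from repeated pairs of tosses
trials = 100_000
hits = sum(
    1 for _ in range(trials)
    if random.random() < p_heads and random.random() < p_heads
)
print(hits / trials)  # close to 0.25
```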
Joint Probability
Example 2
Assume that an unfair coin with P(H) = 0.9 and P(T) = 0.1 is tossed. What is the probability of getting three heads on three successive tosses?
Answer
P(H1H2H3) = P(H1) × P(H2) × P(H3)
where
  H1 = heads on the first toss
  H2 = heads on the second toss
  H3 = heads on the third toss
The tosses are statistically independent, thus
P(H1H2H3) = 0.9 × 0.9 × 0.9 = 0.729
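A minimal sketch of the same product rule for any number of independent events, using Python's math.prod; the helper name joint_prob is a choice made for this illustration, not something from the slides.

```python
from math import prod

def joint_prob(probabilities):
    """Joint probability of independent events: the product of their probabilities."""
    return prod(probabilities)

# Three heads in a row with the unfair coin, P(H) = 0.9 on each toss
print(joint_prob([0.9, 0.9, 0.9]))  # approximately 0.729
```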
Joint Probability
Example 3
What is the probability of at least one head appearing on two tosses of a fair coin?
Answer
The possible ways a head may occur are: H1H2, H1T2, T1H2; each of these has a probability of 0.25, hence

P(at least one head appears on 2 tosses) = P(H1H2) + P(H1T2) + P(T1H2)
                                         = 0.25 + 0.25 + 0.25
                                         = 0.75

Alternatively,
P(at least one head appears on 2 tosses) = 1 - P(T1T2)
                                         = 1 - 0.25 = 0.75
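The "at least one" calculation can be checked both ways in a short Python sketch added here for illustration, assuming a fair coin.

```python
from itertools import product

# Enumerate all equally likely outcomes of two fair-coin tosses
outcomes = list(product("HT", repeat=2))   # [('H','H'), ('H','T'), ('T','H'), ('T','T')]
p_outcome = 0.25                           # each outcome has probability 0.5 * 0.5

# Direct sum over the outcomes containing at least one head
p_at_least_one_head = sum(p_outcome for o in outcomes if "H" in o)
print(p_at_least_one_head)                 # 0.75

# Complement rule: 1 - P(no heads at all)
print(1 - p_outcome)                       # 0.75
```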
Conditional Probabilities under Statistical
Independence
Conditional Probability
Symbolically, conditional probability is written P(B|A) and is read:
P(B|A) = the probability of event B, given that event A has occurred
Conditional Probability
For statistically independent events, the conditional
probability of event B given that event A has occurred is the
same as the unconditional probability of event B;
symbolically,
P(B|A) = P(B)
Conditional Probability
Example
What is the probability that the second toss of a fair coin will result in heads, given that heads occurred on the first toss?
Answer
P(H2|H1) = P(H2) = 0.5
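A short simulation (an illustration added here, not from the slides) shows that for independent tosses the conditional relative frequency of heads on the second toss, given heads on the first, stays near the unconditional 0.5.

```python
import random

random.seed(1)                       # fixed seed so the run is repeatable
trials = 100_000

first_heads = 0                      # trials where the first toss is heads
both_heads = 0                       # trials where the first AND second toss are heads

for _ in range(trials):
    h1 = random.random() < 0.5
    h2 = random.random() < 0.5
    if h1:
        first_heads += 1
        if h2:
            both_heads += 1

# Estimated P(H2 | H1); for independent tosses it should be close to P(H2) = 0.5
print(both_heads / first_heads)
```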
Statistical dependence exists when the probability of
some event is dependent upon or affected by the
occurrence of some other events.
Marginal Probabilities under Statistical Dependence
The marginal probability of a statistically dependent event is exactly the same as that of a statistically independent event; one and only one probability is involved, since a marginal probability refers to only one event.
Conditional Probabilities under Statistical
Dependence
Computed as:
P(A|B) = P(AB) / P(B)
Example
Assume we have one urn containing 10 balls distributed as follows
3 are red and dotted; 1 is red and striped;
2 are gray and dotted; 4 are gray and striped
The probability of drawing any particular ball from the urn is 0.1, since there are
10 balls, each with equal probability of being drawn
Suppose someone draws a ball from the urn and tells us it is red. What is the
probability that it is dotted?
Answer
Two separate categories: Color (red or gray) and Pattern (dotted or striped)

P(D|R) = P(DR) / P(R) = 3/4 = 0.75
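The urn example can also be worked directly from the counts; a small sketch, with the ball counts taken from the example above and the helper names chosen for this illustration.

```python
# Counts of the 10 balls in the urn, keyed by (color, pattern)
counts = {
    ("red", "dotted"): 3,
    ("red", "striped"): 1,
    ("gray", "dotted"): 2,
    ("gray", "striped"): 4,
}

def conditional(pattern, color):
    """P(pattern | color) = P(pattern and color) / P(color); the total of 10 cancels,
    so the ratio of counts gives the same answer."""
    color_count = sum(n for (c, _), n in counts.items() if c == color)
    return counts[(color, pattern)] / color_count

print(conditional("dotted", "red"))   # 0.75
```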
Joint Probabilities under Statistical Dependence
Calculated as:
P(AB) = P(A|B) × P(B)
Type of probability   Symbol   Formula (under statistical dependence)   Formula (under statistical independence)
Marginal              P(A)     P(A)                                      P(A)
Joint                 P(AB)    P(A|B) × P(B)                             P(A) × P(B)
Conditional           P(A|B)   P(AB) / P(B)                              P(A)
The basic formula for conditional probability under conditions of statistical dependence is called Bayes' theorem.
What is the probability of getting tails, heads, tails,
in that order on three successive tosses of a fair
coin? Show your solution using
a. Probability calculation
b. Probability tree diagram
a. Using Probability Calculation
P(T1H2T3) = P(T1) × P(H2) × P(T3)
where
  T1 = tails on the first toss
  H2 = heads on the second toss
  T3 = tails on the third toss
P(T1H2T3) = 0.5 × 0.5 × 0.5 = 0.125

The probability of tails, heads, tails on three successive tosses is 0.125.
b. Using Probability Tree
[Probability tree: each toss branches into H and T with probability 0.5; the four second-toss outcomes have probability 0.25 each, and the eight third-toss outcomes have probability 0.125 each, one of which is T1H2T3.]
What is the probability of at least one tail on three successive tosses of a fair coin?
a. Using Probability Calculation
There is only one case in which no tails occur, namely H1H2H3, hence

P(at least one tail appears on three tosses) = 1 - P(H1H2H3)
                                             = 1 - 0.125
                                             = 0.875
Assume we have one urn containing 10 balls
distributed as follows:
3 are red and dotted; 1 is red and striped;
2 are gray and dotted; 4 are gray and striped
(Two separate categories: Color (red or gray) and Pattern (dotted or striped))

a. What is P(D|G)?
b. What is P(S|G)?
c. Calculate P(R|D) and P(G|D)
d. Calculate P(R|S) and P(G|S)
a. P(D|G) = P(DG) / P(G) = 2/6 = 1/3
b. P(S|G) = P(SG) / P(G) = 4/6 = 2/3
c. P(R|D) = P(RD) / P(D) = 3/5 = 0.6
   P(G|D) = P(GD) / P(D) = 2/5 = 0.4
d. P(R|S) = P(RS) / P(S) = 1/5 = 0.2
   P(G|S) = P(GS) / P(S) = 4/5 = 0.8
Consider the case of a manufacturer who has an
automatic machine which produces ball bearings.
If the machine is correctly set up, i.e., properly
adjusted, it produces 90 percent acceptable parts.
If it is incorrectly set up, it produces 40 percent
acceptable parts. Past experience indicates that 70
percent of the setups are correctly done. After a
certain setup, the machine produces three
acceptable bearings as the first three pieces. What
is the probability that the setup has been correctly
done?
a.
P(A|B) = P(AB) / P(B)

P(correct | 3 good parts) = P(correct, 3 good parts) / P(3 good parts)
                          = 0.5103 / 0.5295
                          = 0.9637

The probability that the machine is correctly set up is 0.9637, or 96.37 percent.
Event       P(event)   P(1 good part | event)   P(3 good parts | event)   P(event, 3 good parts)
Correct     0.70       0.90                     0.729                     0.729 × 0.70 = 0.5103
Incorrect   0.30       0.40                     0.064                     0.064 × 0.30 = 0.0192
            1.00                                                          P(3 good) = 0.5295
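The table above is essentially a Bayes' theorem computation; a minimal Python sketch of the same arithmetic follows (the dictionary layout and names are choices made for this illustration).

```python
# Prior probabilities of the two setup states and the per-part acceptance rates
priors = {"correct": 0.70, "incorrect": 0.30}
p_good_part = {"correct": 0.90, "incorrect": 0.40}

# Likelihood of observing 3 acceptable parts in a row under each setup state
likelihood = {state: p ** 3 for state, p in p_good_part.items()}    # 0.729 and 0.064

# Joint probabilities P(state, 3 good parts) and the total P(3 good parts)
joint = {state: priors[state] * likelihood[state] for state in priors}
p_three_good = sum(joint.values())                                   # about 0.5295

# Posterior probability that the setup was correct, given 3 good parts
print(joint["correct"] / p_three_good)                               # about 0.9637
```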
Probability Distribution
A list of the outcomes of an experiment with the
probabilities we would expect to see associated with
these outcomes
Discrete and Continuous Distributions
Discrete Probability Distribution
A probability distribution in which the variable is allowed to
take on only a limited number of values
Continuous Probability Distribution
A probability distribution in which the variable is permitted
to take on any value within a given range
Probability Distribution
Can be expressed graphically or in tabular form
[Bar chart: probability of a tail in 2 tosses of a fair coin, with P(T) on the vertical axis and the number of tails (0, 1, 2) on the horizontal axis.]
Number of tails, T   Probability of this outcome, P(T)
0                    0.25
1                    0.50
2                    0.25
Random Variables
Random variable
A variable that takes on different values as a result of the
outcomes of a random experiment
Can be discrete or continuous:
A discrete random variable is a variable allowed to take on
only a limited number of values
A continuous random variable is a random variable allowed to
take on any value within a given range
Random Variables
The expected value of a random variable
Expected value is a weighted average of the outcomes of an
experiment
The expected value of a discrete random variable is computed
as:
E(x) = Σ x P(x)
where
  E(x) = expected value of the random variable
  Σ    = the sum of
  x    = value of the random variable
  P(x) = probability that the random variable will take on the value x
Number of speakers sold, x (1)   No. of days this quantity was sold (frequency)   Probability P(x) (relative frequency) (2)   x × P(x) [(1) × (2)]
100     1     0.01   1.00
101     2     0.02   2.02
102     3     0.03   3.06
103     5     0.05   5.15
104     6     0.06   6.24
105     7     0.07   7.35
106     9     0.09   9.54
107     10    0.10   10.70
108     12    0.12   12.96
109     11    0.11   11.99
110     9     0.09   9.90
111     8     0.08   8.88
112     6     0.06   6.72
113     5     0.05   5.65
114     4     0.04   4.56
115     2     0.02   2.30
Total   100   1.00   E(x) = 108.02
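A short sketch (added for illustration) of the expected-value computation for the speaker-sales table; the list of (value, probability) pairs simply restates the table above.

```python
# (number of speakers sold, probability) pairs from the table above
distribution = [
    (100, 0.01), (101, 0.02), (102, 0.03), (103, 0.05),
    (104, 0.06), (105, 0.07), (106, 0.09), (107, 0.10),
    (108, 0.12), (109, 0.11), (110, 0.09), (111, 0.08),
    (112, 0.06), (113, 0.05), (114, 0.04), (115, 0.02),
]

# Expected value of a discrete random variable: E(x) = sum of x * P(x)
expected_value = sum(x * p for x, p in distribution)
print(round(expected_value, 2))   # 108.02
```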
Types of Probability Distribution
Binomial distribution
Poisson distribution
Exponential distribution
Normal distribution
Binomial Distribution
A discrete distribution of the results of an experiment
known as a Bernoulli process
Bernoulli process is a process in which each trial has only two
possible outcomes, where the probability of the outcome of
any trial remains fixed over time, and where the trials are
statistically independent
e.g., tossing of a fair coin, success or failure of a college graduate
on a job interview aptitude test
Binomial Distribution
Binomial formula:
Probability of r successes in n trials = [n! / (r! (n-r)!)] p^r q^(n-r)
where
  p = probability of success
  q = probability of failure = (1 - p)
  r = number of successes
  n = total number of trials
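A small, self-contained Python sketch of the binomial formula; the function name binomial_probability is an arbitrary choice for this illustration.

```python
from math import comb

def binomial_probability(r, n, p):
    """P(r successes in n trials) = [n! / (r!(n-r)!)] * p**r * q**(n-r)."""
    q = 1 - p                          # probability of failure
    return comb(n, r) * p**r * q**(n - r)

# The EPA example that follows: 3 favorable tests out of 6, with p = 0.8
print(round(binomial_probability(3, 6, 0.8), 3))   # about 0.082
```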
Binomial Distribution
Example:
Some field representatives of the Environmental Protection Agency are
doing spot checks of water pollution in streams. Historically, 8 out of 10
such tests produce favorable results, that is, no pollution. The field
group is going to perform 6 tests and wants to know the chances of
getting exactly 3 favorable results from this group of tests.
Probability of r successes in n trials = [n! / (r! (n-r)!)] p^r q^(n-r)
where p = 0.8, q = 0.2, r = 3, n = 6

Probability of 3 favorable tests out of 6
  = [6! / (3! (6-3)!)] (0.8)^3 (0.2)^3
  = [6! / (3! 3!)] (0.8)^3 (0.2)^3
  = [(6 × 5 × 4 × 3 × 2 × 1) / ((3 × 2 × 1)(3 × 2 × 1))] (0.512)(0.008)
  = (120 / 6)(0.0041)
  = 0.492 / 6
  = 0.082

There is less than 1 chance in 10 of getting exactly 3 favorable tests out of 6.
Example:
Five employees are required to operate a chemical process; the process
cannot be started until all 5 work stations are manned. Employee records
indicate there is a 0.4 chance of any one employee being late, and we
know that they all come to work independently of each other.
Management is interested in knowing the probabilities of 0,1,2,3,4, or 5
employees being late, so that a decision concerning the number of back-
up personnel can be made.
P(r late arrivals out of n employees) = [n! / (r! (n-r)!)] p^r q^(n-r)
where p = 0.4, q = 0.6, n = 5

For r = 0:
P(0) = [5! / (0! (5-0)!)] (0.4)^0 (0.6)^5
     = [(5 × 4 × 3 × 2 × 1) / (5 × 4 × 3 × 2 × 1)] (1)(0.07776)
     = 0.07776

For r = 1:
P(1) = [5! / (1! (5-1)!)] (0.4)^1 (0.6)^4
     = [(5 × 4 × 3 × 2 × 1) / (4 × 3 × 2 × 1)] (0.4)(0.1296)
     = 0.2592

Similarly,
r = 2: P(2) = 0.3456
r = 3: P(3) = 0.2304
r = 4: P(4) = 0.0768
r = 5: P(5) = 0.01024
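The full distribution for the late-arrival example can be generated with a short loop; a sketch assuming p = 0.4 and n = 5 as in the example above.

```python
from math import comb

n, p = 5, 0.4                          # 5 employees, each late with probability 0.4

# P(r late arrivals) for every possible r, using the binomial formula
for r in range(n + 1):
    prob = comb(n, r) * p**r * (1 - p)**(n - r)
    print(r, round(prob, 5))

# Expected output: 0.07776, 0.2592, 0.3456, 0.2304, 0.0768, 0.01024
```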
[Histograms of the binomial distribution for n = 5 with p = 0.1, 0.3, 0.4, 0.5, 0.7, and 0.9.]
When p is small (0.1), the binomial distribution is
skewed to the right
As p increases (to 0.3), the skewness is less noticeable
When p=0.5, the binomial distribution is symmetrical
When p is larger than 0.5, the distribution is skewed to
the left
[Histograms of the binomial distribution with p = 0.4 for n = 5, 10, and 20.]
As n increases, the vertical lines become more
numerous and tend to bunch up together to form
something like a bell-shape.
Poisson Distribution
A discrete distribution in which the probability of the
occurrence of an event within a small time period is
very small, in which the probability that two or more
such events will occur within the same time interval is
effectively 0, and in which the probability of the
occurrence of the event within one time period is
independent of where that period is.
Poisson Distribution
The probability of exactly x occurrences in a Poisson
distribution is calculated using the formula:
P(x) = (λ^x)(e^-λ) / x!
where
  P(x) = probability of exactly x occurrences
  λ^x  = lambda (the average number of occurrences per interval of time) raised to the x power
  e^-λ = e (2.71828, the base of the natural logarithm system) raised to the negative lambda power
  x!   = x factorial
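A minimal Python sketch of the Poisson formula; the lambda value used in the print statement is an arbitrary example chosen for this illustration, not a figure from the slides.

```python
from math import exp, factorial

def poisson_probability(x, lam):
    """P(x) = (lam**x * e**(-lam)) / x!  for a Poisson distribution with mean lam."""
    return (lam ** x) * exp(-lam) / factorial(x)

# Example (values assumed for illustration): probability of exactly 2 occurrences
# when the average number of occurrences per interval is 3
print(round(poisson_probability(2, 3), 4))   # about 0.224
```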
Exponential Distribution
A continuous probability distribution used to describe