PPT6-Probability and Random Variables

ISYE6189 - Deterministic Optimization and Stochastic Processes
Topic 6 - Week 6
Probability & Random Variables
Learning Outcome
• LO3: Apply the concepts of discrete- and continuous-time Markov chains, transition matrices, and state classifications
PROBABILITY & RANDOM VARIABLES
Sub Topics:
- Conditional Probability
- Bayes' Theorem
- Discrete Random Variables: Bernoulli, Binomial, Poisson
- Continuous Random Variables: Normal, Exponential, Gamma
- Conditional Expectation
Sample Spaces and Events
Envision an experiment whose outcome is unknown.
• The collection of all possible outcomes is called the sample space.
– Sample spaces can be discrete, e.g., {HH, HT, TH, TT} for two coins
– Or continuous, e.g., [0, ∞)
• A set of outcomes, or subset of the sample space, is called an event. If E and F are events,
– E ∪ F is the event that either E or F (or both) occurs
– E ∩ F = EF is the event that both E and F occur. If EF = ∅, then E and F are called mutually exclusive.
– The complement of E, written Ec, is the event that E does not occur
• A probability space is a triple (S, F, P) where S is a sample space, F is a collection of events from the sample space, and P is a probability law that assigns a number to each event in F. P must satisfy:
– P(S) = 1
– 0 ≤ P(A) ≤ 1
– For any collection of mutually exclusive events E1, E2, …, P(∪i Ei) = Σi P(Ei)
– If S is discrete, then F is the set of all subsets of S
– If S is continuous, then F can be defined in terms of "basic events of interest," e.g., if S = [0, 1] the basic events could be all intervals (a, b) with 0 ≤ a < b ≤ 1. Then F would be the set of these intervals along with all their countable unions and intersections.
In the die-tossing example, if we suppose that all six numbers are equally likely to appear, then we would have

P({1}) = P({2}) = P({3}) = P({4}) = P({5}) = P({6}) = 1/6

and it would follow that the probability of getting an even number would equal

P({2, 4, 6}) = P({2}) + P({4}) + P({6}) = 1/2

Probability
• If S consists of n equally likely outcomes, then the probability of each is 1/n.
• Since E and Ec are mutually exclusive and E ∪ Ec = S, we have P(Ec) = 1 - P(E).
• P(E ∪ F) = P(E) + P(F) - P(EF)
• P(E ∪ F ∪ G) = P(E) + P(F) + P(G) - P(EF) - P(EG) - P(FG) + P(EFG)
• This inclusion-exclusion rule generalizes to any number of events.
• Use Venn diagrams!
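The inclusion-exclusion rule can be verified by brute-force enumeration. A minimal Python sketch (the particular events E and F are illustrative choices, not from the slides):

```python
from fractions import Fraction

# Sample space: one roll of a fair die; every outcome equally likely.
S = {1, 2, 3, 4, 5, 6}

def P(A):
    """Probability of event A under equally likely outcomes."""
    return Fraction(len(A), len(S))

E = {2, 4, 6}  # roll is even
F = {4, 5, 6}  # roll is at least four

# Inclusion-exclusion: P(E ∪ F) = P(E) + P(F) - P(E ∩ F)
lhs = P(E | F)
rhs = P(E) + P(F) - P(E & F)
print(lhs, rhs)  # 2/3 2/3
```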
Suppose that we toss two coins, and assume that each of the four outcomes in the sample space

S = {(H, H), (H, T), (T, H), (T, T)}

is equally likely and hence has probability 1/4. Let

E = {(H, H), (H, T)} and F = {(H, H), (T, H)}

That is, E is the event that the first coin falls heads, and F is the event that the second coin falls heads.
CONDITIONAL PROBABILITIES
Conditional Probabilities
• If A and B are events with P(B) ≠ 0, the conditional probability of A given B is

P(A | B) = P(A ∩ B) / P(B)

• This formula also tells how to find the probability of AB:

P(A ∩ B) = P(A | B) P(B) = P(A) P(B | A)

• A and B are independent events if P(A | B) = P(A), or equivalently if

P(AB) = P(A) P(B)

• A set of events E1, E2, …, En is independent if for every subset E1', E2', …, Er',

P(E1' E2' ⋯ Er') = P(E1') P(E2') ⋯ P(Er')

(pairwise independence is not enough – see Example 1.10)
If events are mutually exclusive, are they independent? Vice versa?
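Conditional probability and independence can be checked by enumeration. A minimal Python sketch using the two-coin sample space from the earlier example:

```python
from fractions import Fraction
from itertools import product

# All four outcomes of two fair coin tosses, equally likely.
S = list(product("HT", repeat=2))

def P(A):
    return Fraction(len(A), len(S))

E = {s for s in S if s[0] == "H"}  # first coin falls heads
F = {s for s in S if s[1] == "H"}  # second coin falls heads

# Conditional probability: P(E | F) = P(E ∩ F) / P(F)
cond = P(E & F) / P(F)
print(cond)  # 1/2, which equals P(E): E and F are independent
```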
Example
Suppose cards numbered one through ten are placed in a hat, mixed up, and then one of the cards is drawn. If we are told that the number on the drawn card is at least five, then what is the conditional probability that it is ten?

Solution: Let E denote the event that the number on the drawn card is ten, and let F be the event that it is at least five. The desired probability is

P(E | F) = P(EF) / P(F) = (1/10) / (6/10) = 1/6
Example
A family has two children. What is the conditional probability that both are boys given that at least one of them is a boy? Assume that the sample space S is given by S = {(b, b), (b, g), (g, b), (g, g)}, and all outcomes are equally likely. ((b, g) means, for instance, that the older child is a boy and the younger child a girl.)

Solution: Letting B denote the event that both children are boys, and A the event that at least one of them is a boy, the desired probability is given by

P(B | A) = P(BA) / P(A) = (1/4) / (3/4) = 1/3
Bev can either take a course in computers or in chemistry. If Bev takes the computer course, then she will receive an A grade with probability 1/2; if she takes the chemistry course, then she will receive an A grade with probability 1/3. Bev decides to base her decision on the flip of a fair coin. What is the probability that Bev will get an A in chemistry?

Solution: If we let C be the event that Bev takes chemistry and A denote the event that she receives an A in whatever course she takes, then the desired probability is

P(AC) = P(A | C) P(C) = (1/3)(1/2) = 1/6
BAYES' THEOREM
Bayes' Formula
• The law of total probability says that if E and F are any two events, then since E = EF ∪ EFc,

P(E) = P(EF ∪ EFc) = P(EF) + P(EFc)
     = P(E | F) P(F) + P(E | Fc) (1 - P(F))

• It can be generalized to any partition of S, i.e., F1, F2, …, Fn mutually exclusive events with ∪ (i = 1 to n) Fi = S:

P(E) = Σ (i = 1 to n) P(E | Fi) P(Fi)

• Bayes' formula is relevant if we know that E occurred and we want to know which of the F's occurred:

P(Fj | E) = P(EFj) / P(E) = P(E | Fj) P(Fj) / Σ (i = 1 to n) P(E | Fi) P(Fi)
Consider two urns. The first contains two white and seven black balls, and the second contains five white and six black balls. We flip a fair coin and then draw a ball from the first urn or the second urn depending on whether the outcome was heads or tails. What is the conditional probability that the outcome of the toss was heads given that a white ball was selected?

Solution: Let W be the event that a white ball is drawn, and let H be the event that the coin comes up heads. The desired probability P(H | W) may be calculated as follows:

P(H | W) = P(W | H) P(H) / [P(W | H) P(H) + P(W | Hc) P(Hc)]
         = (2/9)(1/2) / [(2/9)(1/2) + (5/11)(1/2)] = 22/67
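The urn calculation follows the total-probability-then-Bayes pattern directly. A small sketch using exact fractions:

```python
from fractions import Fraction

# Prior: a fair coin picks urn 1 (heads) or urn 2 (tails).
p_heads = Fraction(1, 2)

# Likelihoods of drawing a white ball from each urn.
p_white_given_heads = Fraction(2, 9)   # urn 1: 2 white, 7 black
p_white_given_tails = Fraction(5, 11)  # urn 2: 5 white, 6 black

# Law of total probability, then Bayes' formula.
p_white = p_white_given_heads * p_heads + p_white_given_tails * (1 - p_heads)
p_heads_given_white = p_white_given_heads * p_heads / p_white
print(p_heads_given_white)  # 22/67
```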
In answering a question on a multiple-choice test a student either knows the answer or guesses. Let p be the probability that she knows the answer and 1 - p the probability that she guesses. Assume that a student who guesses at the answer will be correct with probability 1/m, where m is the number of multiple-choice alternatives. What is the conditional probability that a student knew the answer to a question given that she answered it correctly?

Solution: Let C and K denote respectively the event that the student answers the question correctly and the event that she actually knows the answer. Then

P(K | C) = P(C | K) P(K) / [P(C | K) P(K) + P(C | Kc) P(Kc)]
         = p / [p + (1/m)(1 - p)] = mp / [1 + (m - 1)p]
DISCRETE RANDOM VARIABLES
Definition of Random Variable
A random variable is a function that assigns a number to each outcome in a sample space.
• If the set of all possible values of a random variable X is countable, then X is discrete. The distribution of X is described by a probability mass function:

p(a) = P{s ∈ S : X(s) = a} = P{X = a}

• Otherwise, X is a continuous random variable if there is a nonnegative function f(x), defined for all real numbers x, such that for any set B,

P{s ∈ S : X(s) ∈ B} = P{X ∈ B} = ∫_B f(x) dx

f(x) is called the probability density function of X.
pmf's and cdf's
• The probability mass function (pmf) of a discrete random variable is positive for at most a countable number of values of X: x1, x2, …, and

Σ (i = 1 to ∞) p(xi) = 1

• The cumulative distribution function (cdf) of any random variable X is

F(x) = P{X ≤ x}

F(x) is a nondecreasing function with

lim (x → -∞) F(x) = 0 and lim (x → ∞) F(x) = 1

• For a discrete random variable X,

F(a) = Σ (xi ≤ a) p(xi)
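The discrete cdf is just a cumulative sum over the pmf. A small sketch (the pmf here, the number of heads in two fair coin flips, is an illustrative choice):

```python
from fractions import Fraction

# pmf of X = number of heads in two fair coin flips.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def cdf(a):
    """F(a) = P{X <= a} = sum of p(x_i) over x_i <= a."""
    return sum((p for x, p in pmf.items() if x <= a), Fraction(0))

print(cdf(-1), cdf(0), cdf(1), cdf(5))  # 0 1/4 3/4 1
```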
Bernoulli Random Variable
• An experiment has two possible outcomes, called
“success” and “failure”: sometimes called a Bernoulli trial
• The probability of success is p
• X = 1 if success occurs, X = 0 if failure occurs
Then p(0) = P{X = 0} = 1 – p and p(1) = P{X = 1} = p

X is a Bernoulli random variable with parameter p.


Binomial Random Variable
• A sequence of n independent Bernoulli trials is performed, where the probability of success on each trial is p
• X is the number of successes
Then for i = 0, 1, …, n,

p(i) = P{X = i} = C(n, i) p^i (1 - p)^(n-i)

where C(n, i) is the binomial coefficient

C(n, i) = n! / (i! (n - i)!)

X is a binomial random variable with parameters n and p.
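The binomial pmf can be computed directly with Python's math.comb; a minimal sketch:

```python
from math import comb

def binom_pmf(i, n, p):
    """P{X = i} for X ~ Binomial(n, p)."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

# Sanity check: the pmf sums to 1 over i = 0, ..., n.
total = sum(binom_pmf(i, 4, 0.5) for i in range(5))
print(binom_pmf(2, 4, 0.5), total)  # 0.375 1.0
```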
Four fair coins are flipped. If the outcomes are assumed independent, what is the probability that two heads and two tails are obtained?

Solution: Letting X equal the number of heads ("successes") that appear, then X is a binomial random variable with parameters n = 4, p = 1/2. Hence,

P{X = 2} = C(4, 2) (1/2)^2 (1/2)^2 = 6/16 = 3/8
It is known that any item produced by a certain machine will be defective with probability 0.1, independently of any other item. What is the probability that in a sample of three items, at most one will be defective?

Solution: If X is the number of defective items in the sample, then X is a binomial random variable with parameters (3, 0.1). Hence, the desired probability is given by

P{X = 0} + P{X = 1} = (0.9)^3 + 3(0.1)(0.9)^2 = 0.972
Poisson Random Variable
X is a Poisson random variable with parameter λ > 0 if

p(i) = P{X = i} = e^(-λ) λ^i / i!,  i = 0, 1, …

note: Σ (i = 0 to ∞) p(i) = 1 follows from e^λ = Σ (i = 0 to ∞) λ^i / i!

X can represent the number of "rare events" that occur during an interval of specified length.
A Poisson random variable can also approximate a binomial random variable with large n and small p if λ = np: split the interval into n subintervals, and label the occurrence of an event during a subinterval a "success."
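The Poisson approximation to the binomial can be checked numerically; a small sketch with illustrative values n = 1000, p = 0.003 (so λ = np = 3):

```python
from math import comb, exp, factorial

def binom_pmf(i, n, p):
    return comb(n, i) * p**i * (1 - p)**(n - i)

def poisson_pmf(i, lam):
    """P{X = i} = e^(-lam) * lam^i / i!"""
    return exp(-lam) * lam**i / factorial(i)

n, p = 1000, 0.003
lam = n * p  # 3.0
for i in range(5):
    # The two pmfs agree to roughly three decimal places.
    print(i, round(binom_pmf(i, n, p), 4), round(poisson_pmf(i, lam), 4))
```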
Suppose that the number of typographical errors on a single page of this book has a Poisson distribution with parameter λ = 1. Calculate the probability that there is at least one error on this page.

Solution: P{X ≥ 1} = 1 - P{X = 0} = 1 - e^(-1) ≈ 0.633

If the number of accidents occurring on a highway each day is a Poisson random variable with parameter λ = 3, what is the probability that no accidents occur today?

Solution: P{X = 0} = e^(-3) ≈ 0.05
CONTINUOUS RANDOM VARIABLES
Continuous Random Variables
A probability density function (pdf) must satisfy:

f(x) ≥ 0
∫ (-∞ to ∞) f(x) dx = 1
P{a ≤ X ≤ b} = ∫ (a to b) f(x) dx  (note P{X = a} = 0)

The cdf is F(a) = P{X ≤ a} = ∫ (-∞ to a) f(x) dx, so f(x) = dF(x)/dx

P{a - ε/2 ≤ X ≤ a + ε/2} ≈ ε f(a)
Uniform Random Variable
X is uniformly distributed over an interval (a, b) if its pdf is

f(x) = 1/(b - a) for a < x < b; 0 otherwise

Then its cdf is:

F(x) = 0 for x ≤ a; (x - a)/(b - a) for a < x < b; 1 for x ≥ b
If X is uniformly distributed over (0, 10), calculate the probability that (a) X < 3, (b) X > 7, (c) 1 < X < 6.

Solution: (a) P{X < 3} = 3/10, (b) P{X > 7} = 3/10, (c) P{1 < X < 6} = 5/10 = 1/2

Exponential Random Variable
X has an exponential distribution with parameter λ > 0 if its pdf is

f(x) = λ e^(-λx) for x ≥ 0; 0 otherwise

Then its cdf is:

F(x) = 1 - e^(-λx) for x ≥ 0; 0 for x < 0

This distribution has very special characteristics that we will use often!
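One of those special characteristics is memorylessness: P{X > s + t | X > s} = P{X > t}, which follows from the cdf above. A small numeric check, with an illustrative rate λ = 0.5:

```python
from math import exp

lam = 0.5  # illustrative rate parameter

def survival(x):
    """P{X > x} = 1 - F(x) = e^(-lam*x) for x >= 0."""
    return exp(-lam * x)

s, t = 2.0, 3.0
cond = survival(s + t) / survival(s)  # P{X > s+t | X > s}
print(cond, survival(t))              # both approx 0.2231: X "forgets" s
```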
Gamma Random Variable
X has a gamma distribution with parameters λ > 0 and α > 0 if its pdf is

f(x) = λ e^(-λx) (λx)^(α-1) / Γ(α) for x ≥ 0; 0 otherwise

It gets its name from the gamma function

Γ(α) = ∫ (0 to ∞) e^(-x) x^(α-1) dx

If α is an integer, then Γ(α) = (α - 1)!
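The last identity is easy to spot-check with Python's math.gamma:

```python
from math import gamma, factorial

# For integer a, Gamma(a) = (a - 1)!
for a in range(1, 7):
    print(a, gamma(a), factorial(a - 1))
```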


EXPECTATION
Expectation
The expected value (mean) of a random variable is

E[X] = Σ_i xi p(xi) (discrete), or ∫ (-∞ to ∞) x f(x) dx (continuous)

Also called the first moment – like the moment of inertia of the probability distribution.
If the experiment is repeated and the random variable observed many times, it represents the long-run average value of the r.v.
Find E[X] where X is the outcome when we roll a fair die.

Solution: Since p(1) = p(2) = p(3) = p(4) = p(5) = p(6) = 1/6, we obtain

E[X] = (1 + 2 + 3 + 4 + 5 + 6)(1/6) = 7/2
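The die computation, sketched over the pmf with exact fractions:

```python
from fractions import Fraction

# Fair die: p(x) = 1/6 for x = 1, ..., 6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E[X] = sum over x of x * p(x)
EX = sum(x * p for x, p in pmf.items())
print(EX)  # 7/2
```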
Expectations of Discrete Random Variables

• Bernoulli: E[X] = 1(p) + 0(1 - p) = p
• Binomial: E[X] = np
• Geometric: E[X] = 1/p (by a trick, see text)
• Poisson: E[X] = λ: the parameter is the expected or average number of "rare events" per interval; the random variable is the number of events in a particular interval chosen at random
Expectations of Continuous Random Variables

• Uniform: E[X] = (a + b)/2
• Exponential: E[X] = 1/λ
• Gamma: E[X] = α/λ
• Normal: E[X] = μ: the first parameter is the expected value; note that its density is symmetric about x = μ:

f(x) = (1/√(2πσ²)) e^(-(x-μ)²/(2σ²)),  -∞ < x < ∞
Higher-Order Moments

The nth moment of X is E[X^n]:

E[X^n] = Σ_i xi^n p(xi) (discrete), or ∫ (-∞ to ∞) x^n f(x) dx (continuous)

The variance is

Var(X) = E[(X - E[X])²]

It is sometimes easier to calculate as

Var(X) = E[X²] - (E[X])²
Calculate Var(X) when X represents the outcome when a fair die is rolled.

Solution: As previously noted, E[X] = 7/2. Also,

E[X²] = (1 + 4 + 9 + 16 + 25 + 36)(1/6) = 91/6

Hence, Var(X) = 91/6 - (7/2)² = 35/12
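Both variance formulas give the same answer; a sketch for the fair die:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die
EX = sum(x * p for x, p in pmf.items())          # 7/2
EX2 = sum(x**2 * p for x, p in pmf.items())      # 91/6

var_def = sum((x - EX)**2 * p for x, p in pmf.items())  # E[(X - E[X])^2]
var_alt = EX2 - EX**2                                   # E[X^2] - (E[X])^2
print(var_def, var_alt)  # 35/12 35/12
```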
Discrete Conditional Distributions

Given a joint probability mass function

p(x, y) = P{X = x, Y = y}

the conditional pmf of X given that Y = y is

pX|Y(x | y) = P{X = x | Y = y} = p(x, y) / pY(y) if pY(y) > 0

The conditional expectation of X given Y = y is

E[X | Y = y] = Σ_x x pX|Y(x | y)
Suppose that p(x, y), the joint probability mass function of X and Y, is given by p(1, 1) = 0.5, p(1, 2) = 0.1, p(2, 1) = 0.1, p(2, 2) = 0.3. Calculate the conditional probability mass function of X given that Y = 1.

Solution: We first note that

pY(1) = p(1, 1) + p(2, 1) = 0.6

Hence,

pX|Y(1 | 1) = p(1, 1)/pY(1) = 5/6 and pX|Y(2 | 1) = p(2, 1)/pY(1) = 1/6
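The same computation, sketched in Python with the joint pmf stored as a dict:

```python
from fractions import Fraction

# Joint pmf p(x, y) from the example.
joint = {(1, 1): Fraction(1, 2), (1, 2): Fraction(1, 10),
         (2, 1): Fraction(1, 10), (2, 2): Fraction(3, 10)}

def cond_pmf_x_given_y(y):
    """pX|Y(x | y) = p(x, y) / pY(y)."""
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)  # marginal of Y
    return {x: p / p_y for (x, yy), p in joint.items() if yy == y}

print(cond_pmf_x_given_y(1))  # {1: Fraction(5, 6), 2: Fraction(1, 6)}
```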
Thank You
