PPT6-Probability and Random Variables
That is, E is the event that the first coin falls heads, and F is the event
that the second coin falls heads.
CONDITIONAL PROBABILITIES
Conditional Probabilities
Solution: Let E denote the event that the number of the drawn card is ten, and
let F be the event that it is at least five. The desired probability is P(E|F).
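Assuming the standard setup for this example (one card drawn at random from cards numbered 1 through 10, so that "at least five" corresponds to the six cards 5, 6, …, 10), the computation would be
P(E|F) = P(EF)/P(F) = (1/10)/(6/10) = 1/6.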
Example
A family has two children. What is the conditional probability that both are boys
given that at least one of them is a boy? Assume that the sample space S is given
by S = {(b, b), (b, g), (g, b), (g, g)}, and all outcomes are equally likely. ((b, g) means,
for instance, that the older child is a boy and the younger child a girl.)
Solution: Letting B denote the event that both children are boys, and A the event
that at least one of them is a boy, then the desired probability is given by
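P(B|A) = P(BA)/P(A) = P({(b, b)}) / P({(b, b), (b, g), (g, b)}) = (1/4)/(3/4) = 1/3.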
Bev can either take a course in computers or in chemistry. If Bev takes the
computer course, then she will receive an A grade with probability ½ ; if she takes
the chemistry course then she will receive an A grade with probability 1/3.
Bev decides to base her decision on the flip of a fair coin. What is the probability
that Bev will get an A in chemistry?
Solution: If we let C be the event that Bev takes chemistry and A denote the
event that she receives an A in whatever course she takes, then the desired
probability is P(AC).
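Since the coin is fair, P(C) = 1/2, so
P(AC) = P(A|C)P(C) = (1/3)(1/2) = 1/6.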
BAYES’ THEOREM
Bayes’ Formula
• The law of total probability says that if E and F are any two events, then since
E = EF ∪ EF^c,
P(E) = P(EF ∪ EF^c) = P(EF) + P(EF^c)
= P(E|F)P(F) + P(E|F^c)[1 − P(F)]
• It can be generalized to any partition of S: F1, F2, …, Fn mutually exclusive
events with ∪i Fi = S (union over i = 1, …, n).
• Bayes’ formula is relevant if we know that E occurred and we want to know
which of the F’s occurred. Since
P(E) = ∑i P(E|Fi)P(Fi)   (summing over i = 1, …, n),
we obtain
P(Fj|E) = P(EFj)/P(E) = P(E|Fj)P(Fj) / ∑i P(E|Fi)P(Fi)
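As an illustrative aside, the posterior probabilities P(Fj|E) can be computed mechanically from the priors P(Fj) and the likelihoods P(E|Fj); the short Python sketch below (the function name bayes_posterior is our own, and the sample numbers are taken from the two-urn example that follows) simply evaluates the formula above.

def bayes_posterior(priors, likelihoods):
    """Given P(Fi) and P(E|Fi) for each i, return the posteriors P(Fi|E)."""
    # Law of total probability: P(E) = sum_i P(E|Fi) P(Fi)
    p_e = sum(p * l for p, l in zip(priors, likelihoods))
    # Bayes' formula: P(Fi|E) = P(E|Fi) P(Fi) / P(E)
    return [p * l / p_e for p, l in zip(priors, likelihoods)]

# Numbers from the two-urn example that follows:
# F1 = heads (urn 1), F2 = tails (urn 2), E = a white ball is drawn.
print(bayes_posterior([1/2, 1/2], [2/9, 5/11]))  # [22/67 ≈ 0.328, 45/67 ≈ 0.672]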
Consider two urns. The first contains two white and seven black balls, and the
second contains five white and six black balls. We flip a fair coin and then draw a
ball from the first urn or the second urn depending on whether the outcome was
heads or tails. What is the conditional probability that the outcome of the toss
was heads given that a white ball was selected?
Solution: Let W be the event that a white ball is drawn, and let H be the event
that the coin comes up heads. The desired probability P(H|W) may be calculated
as follows:
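P(H|W) = P(W|H)P(H) / [P(W|H)P(H) + P(W|H^c)P(H^c)]
= (2/9)(1/2) / [(2/9)(1/2) + (5/11)(1/2)]
= 22/67 ≈ 0.33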
In answering a question on a multiple-choice test, a student either knows the
answer or guesses. Let p be the probability that she knows the answer and 1 − p
the probability that she guesses. Assume that a student who guesses at the
answer will be correct with probability 1/m, where m is the number of multiple-
choice alternatives. What is the conditional probability that a student knew the
answer to a question given that she answered it
correctly?
Solution: Let C and K denote respectively the event that the student answers the
question correctly and the event that she actually knows the answer. Now
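P(K|C) = P(KC)/P(C) = P(C|K)P(K) / [P(C|K)P(K) + P(C|K^c)P(K^c)]
= p / [p + (1/m)(1 − p)]
= mp / [1 + (m − 1)p]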
DISCRETE RANDOM VARIABLES
Definition of random variable
A random variable is a function that assigns a number to
each outcome in a sample space.
• If the set of all possible values of a random variable X is
countable, then X is discrete. The distribution of X is
described by a probability mass function:
p(a) = P({s ∈ S : X(s) = a}) = P(X = a)
• Otherwise, X is a continuous random variable if there is a
nonnegative function f(x), defined for all real numbers x,
such that for any set B,
P({s ∈ S : X(s) ∈ B}) = P(X ∈ B) = ∫_B f(x) dx
f(x) is called the probability density function of X.
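As a small illustration of a discrete random variable (our own example), let X be the number of heads in two independent flips of a fair coin. Then X takes the values 0, 1, 2 with pmf p(0) = 1/4, p(1) = 1/2, p(2) = 1/4, and p(0) + p(1) + p(2) = 1.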
pmf’s and cdf’s
• The probability mass function (pmf) for a discrete random
variable is positive for at most a countable number of
values of X: x1, x2, …, and
∑i p(xi) = 1
If X represents the number of successes in n independent trials, each of which is a success with probability p, then its pmf is
p(i) = C(n, i) p^i (1 − p)^(n−i),   i = 0, 1, …, n,
where
C(n, i) = n!/(i!(n − i)!)
X is a binomial random variable with parameters n and p.
Four fair coins are flipped. If the outcomes are assumed independent,
what is the probability that two heads and two tails are obtained?
Solution:
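Letting X denote the number of heads that appear, X is a binomial random variable with parameters n = 4 and p = 1/2, so
P(X = 2) = C(4, 2) (1/2)^2 (1/2)^2 = 6/16 = 3/8.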
CONTINUOUS RANDOM VARIABLES
Continuous random variables
A probability density function (pdf) must satisfy:
f(x) ≥ 0 for all x
∫_{−∞}^{∞} f(x) dx = 1
P(a ≤ X ≤ b) = ∫_a^b f(x) dx   (note P(X = a) = 0)
P(a − dx/2 ≤ X ≤ a + dx/2) ≈ f(a) dx
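As a quick illustration with a density of our own choosing, take f(x) = 2x for 0 < x < 1 and f(x) = 0 otherwise. Then f(x) ≥ 0, ∫_0^1 2x dx = 1, P(0 ≤ X ≤ 1/2) = ∫_0^{1/2} 2x dx = 1/4, and indeed P(X = 1/2) = 0.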
Uniform random variable
X is uniformly distributed on (0, 1) if its cdf is
F(x) = 0 for x ≤ 0,   F(x) = x for 0 < x < 1,   F(x) = 1 for x ≥ 1
Exponential random variable
X is exponentially distributed with rate λ > 0 if its cdf is
F(x) = 0 for x < 0,   F(x) = 1 − e^{−λx} for x ≥ 0
This distribution has very special characteristics that we will
use often!
Gamma random variable
X has a gamma distribution with parameters λ > 0 and α > 0 if
its pdf is
f(x) = λ e^{−λx} (λx)^{α−1} / Γ(α),   x ≥ 0
f(x) = 0,   otherwise
It gets its name from the gamma function
Γ(α) = ∫_0^∞ e^{−x} x^{α−1} dx
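As a quick consistency check (a standard fact, noted here for context): Γ(1) = ∫_0^∞ e^{−x} dx = 1, so for α = 1 the gamma pdf reduces to f(x) = λ e^{−λx} for x ≥ 0, which is the exponential density with parameter λ.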
E[X] = ∑i xi p(xi)   (discrete case)
E[X] = ∫_{−∞}^{∞} x f(x) dx   (continuous case)
Also called the first moment – like the moment of inertia of the
probability distribution.
If the experiment is repeated and the random variable is
observed many times, E[X] represents the long-run
average value of the r.v.
Find E[X] where X is the outcome when we roll a fair die.
Solution: Since p(1) = p(2) = p(3) = p(4) = p(5) = p(6) =1/6, we obtain
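E[X] = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6) = 21/6 = 7/2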
Expectations of Discrete Random Variables
More generally, the nth moment of X is
E[X^n] = ∑i xi^n p(xi)   (discrete case)
E[X^n] = ∫_{−∞}^{∞} x^n f(x) dx   (continuous case)
The variance is
Var(X) = E[(X − E[X])^2]
It is sometimes easier to calculate as
Var(X) = E[X^2] − (E[X])^2
Calculate Var(X) when X represents the outcome when a fair die
is rolled.
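Solution: E[X^2] = (1^2 + 2^2 + 3^2 + 4^2 + 5^2 + 6^2)(1/6) = 91/6, and E[X] = 7/2 from the previous example, so
Var(X) = E[X^2] − (E[X])^2 = 91/6 − 49/4 = 35/12.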
The conditional probability mass function of X given that Y = y is
p_{X|Y}(x|y) = P(X = x | Y = y) = p(x, y) / pY(y),   where pY(y) = ∑x p(x, y),
and the conditional expectation of X given that Y = y is
E[X|Y = y] = ∑x x p_{X|Y}(x|y)
Suppose that p(x, y), the joint probability mass
function of X and Y, is given by p(1, 1) = 0.5, p(1, 2)
= 0.1, p(2, 1) = 0.1, p(2, 2) = 0.3. Calculate the
conditional probability mass function of X given
that Y = 1.
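Solution: First, pY(1) = p(1, 1) + p(2, 1) = 0.5 + 0.1 = 0.6. Hence
p_{X|Y}(1|1) = p(1, 1)/pY(1) = 0.5/0.6 = 5/6
p_{X|Y}(2|1) = p(2, 1)/pY(1) = 0.1/0.6 = 1/6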