Chapter 06 - Jointly Distributed Random Variables
Kutay TİNÇ, Ph.D.
JOINT DISTRIBUTION FUNCTIONS
We are often interested in probability statements concerning two or more random variables. To deal with such probabilities, we define, for any two random variables X and Y, the joint cumulative probability distribution function of X and Y by

F(a, b) = P{X ≤ a, Y ≤ b},   −∞ < a, b < ∞.

When X and Y are discrete, probabilities of joint statements are obtained by summing the joint probability mass function p(x, y) = P{X = x, Y = y} over the relevant pairs (x, y).
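All joint probability statements about X and Y can, in principle, be answered in terms of F. For instance, for a₁ < a₂ and b₁ < b₂,

P{a₁ < X ≤ a₂, b₁ < Y ≤ b₂} = F(a₂, b₂) − F(a₁, b₂) − F(a₂, b₁) + F(a₁, b₁).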
JOINT PDF (CONTINUOUS)
We say that X and Y are jointly continuous if there exists a function f(x, y), defined for all real x and y, having the property that, for every set C of pairs of real numbers (that is, C is a set in the two-dimensional plane),

P{(X, Y) ∈ C} = ∬_{(x,y)∈C} f(x, y) dx dy.

The function f is called the joint probability density function of X and Y. If A and B are any sets of real numbers, then, by defining C = {(x, y) : x ∈ A, y ∈ B}, we see from the equation above that

P{X ∈ A, Y ∈ B} = ∫_B ∫_A f(x, y) dx dy.
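In particular, taking A = (−∞, a] and B = (−∞, b] relates the joint density to the joint distribution function:

F(a, b) = P{X ≤ a, Y ≤ b} = ∫_{−∞}^{b} ∫_{−∞}^{a} f(x, y) dx dy,

so that, upon differentiation, f(a, b) = ∂²F(a, b) / ∂a ∂b wherever the partial derivatives exist.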
MARGINAL PDFS
If X and Y are jointly continuous, they are also individually continuous, and their probability density functions can be obtained as follows:

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy,    f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx.
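As a quick illustration, these marginal integrals can be carried out symbolically; the joint density used below, f(x, y) = 2e^(−x) e^(−2y) for x, y > 0, is an assumed example chosen for demonstration, not one taken from the slides.

    import sympy as sp

    x, y = sp.symbols('x y', positive=True)
    # Assumed illustrative joint density: f(x, y) = 2*exp(-x)*exp(-2y), x > 0, y > 0
    f = 2 * sp.exp(-x) * sp.exp(-2*y)

    f_X = sp.integrate(f, (y, 0, sp.oo))   # marginal of X: exp(-x)
    f_Y = sp.integrate(f, (x, 0, sp.oo))   # marginal of Y: 2*exp(-2*y)
    print(f_X, f_Y)
    # Each marginal integrates to 1, confirming f is a valid joint density
    print(sp.integrate(f_X, (x, 0, sp.oo)), sp.integrate(f_Y, (y, 0, sp.oo)))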
EXAMPLE 2
The joint density function of X and Y is given by
Compute:
a) ,
b) ,
c) .
EXAMPLE 2 - CONTINUED
EXAMPLE 3
The joint density of X and Y is given by:
a) Find c.
b) Find the marginal densities of X and Y.
MORE THAN 2 RANDOM VARIABLES
We can also define joint probability distributions for n random variables in exactly the same manner as we did for two. For instance, the joint cumulative probability distribution function of the random variables X₁, X₂, …, Xₙ is defined by

F(a₁, a₂, …, aₙ) = P{X₁ ≤ a₁, X₂ ≤ a₂, …, Xₙ ≤ aₙ}.

Further, the n random variables are said to be jointly continuous if there exists a function f(x₁, x₂, …, xₙ), called the joint probability density function, such that, for any set C in n-space,

P{(X₁, X₂, …, Xₙ) ∈ C} = ∫∫…∫_{(x₁,…,xₙ)∈C} f(x₁, …, xₙ) dx₁ dx₂ ⋯ dxₙ.
MULTINOMIAL DISTRIBUTION
The multinomial distribution arises when a sequence of n independent and identical experiments is performed. Suppose that each experiment can result in any one of r possible outcomes, with respective probabilities p₁, p₂, …, p_r, where p₁ + p₂ + ⋯ + p_r = 1. If we let Xᵢ denote the number of the n experiments that result in outcome number i, then

P{X₁ = n₁, X₂ = n₂, …, X_r = n_r} = [n! / (n₁! n₂! ⋯ n_r!)] p₁^n₁ p₂^n₂ ⋯ p_r^n_r

whenever n₁ + n₂ + ⋯ + n_r = n.
EXAMPLE 5
Suppose that a fair die is rolled 9 times. Find the probability that 1 appears three
times, 2 and 3 twice each, 4 and 5 once each, and 6 not at all.
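By the multinomial formula with n = 9 and p₁ = ⋯ = p₆ = 1/6,

P = [9! / (3! 2! 2! 1! 1! 0!)] (1/6)⁹ = 15120 / 6⁹ ≈ 0.0015.

A quick numeric check of the same value (assuming SciPy is available):

    from scipy.stats import multinomial
    print(multinomial.pmf([3, 2, 2, 1, 1, 0], n=9, p=[1/6] * 6))  # ≈ 0.0015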
INDEPENDENT RANDOM VARIABLES
The random variables X and Y are said to be independent if, for any two sets of real numbers A and B,

P{X ∈ A, Y ∈ B} = P{X ∈ A} P{Y ∈ B}.

In other words, X and Y are independent if, for all A and B, the events {X ∈ A} and {Y ∈ B} are independent. The equation above can also be translated into

P{X ≤ a, Y ≤ b} = P{X ≤ a} P{Y ≤ b}   for all a, b.

Hence, in terms of the joint distribution function F of X and Y, X and Y are independent if

F(a, b) = F_X(a) F_Y(b)   for all a, b.
INDEPENDENT RANDOM VARIABLES
Suppose that n + m independent trials having a common probability p of success are performed. If X is the number of successes in the first n trials, and Y is the number of successes in the final m trials, then X and Y are independent, since knowing the number of successes in the first n trials does not affect the distribution of the number of successes in the final m trials (by the assumption of independent trials). In fact, for integral x and y,

P{X = x, Y = y} = C(n, x) p^x (1 − p)^(n−x) · C(m, y) p^y (1 − p)^(m−y) = P{X = x} P{Y = y},   0 ≤ x ≤ n, 0 ≤ y ≤ m,

where C(n, x) denotes the binomial coefficient. In contrast, X and Z will be dependent, where Z is the total number of successes in the n + m trials. (Why?)
EXAMPLE 6
A man and a woman decide to meet at a certain location. If each of them independently arrives at a time uniformly distributed between 12 noon and 1 P.M., find the probability that the first to arrive has to wait longer than 10 minutes.

If we let X and Y denote, respectively, the time (in minutes) past 12 that the man and the woman arrive, then X and Y are independent random variables, each of which is uniformly distributed over (0, 60). The desired probability, P{X + 10 < Y} + P{Y + 10 < X}, which, by symmetry, equals 2 P{X + 10 < Y}, is obtained as follows:
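2 P{X + 10 < Y} = 2 ∬_{x+10<y} (1/60)² dx dy = (2/3600) ∫_{10}^{60} ∫_{0}^{y−10} dx dy = (2/3600) ∫_{10}^{60} (y − 10) dy = (2/3600)(50²/2) = 25/36 ≈ 0.694.

A short Monte Carlo check of the same quantity (assuming NumPy is available):

    import numpy as np
    rng = np.random.default_rng(1)
    x = rng.uniform(0, 60, 1_000_000)   # man's arrival time, minutes past noon
    y = rng.uniform(0, 60, 1_000_000)   # woman's arrival time
    print(np.mean(np.abs(x - y) > 10))  # ≈ 25/36 ≈ 0.694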
INDEPENDENT RANDOM VARIABLES
A necessary and sufficient condition for the random variables X and Y to be independent is for their joint probability density function (or joint probability mass function in the discrete case) to factor into two terms, one depending only on x and the other depending only on y.

The continuous (discrete) random variables X and Y are independent if and only if their joint probability density (mass) function can be expressed as

f_{X,Y}(x, y) = h(x) g(y),   −∞ < x < ∞, −∞ < y < ∞.
EXAMPLE 7
If the joint density function of X and Y is:
and is equal to 0 outside this region, are the random variables independent? What if
the joint density function is:
and 0 otherwise?
The first joint density function factors into a function of x alone times a function of y alone, and thus the random variables are independent; in the second case, however, the region on which the density is positive cannot be expressed as a product set of the form {x ∈ A, y ∈ B}, and hence the random variables are not independent.
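As a concrete illustration (the densities here are assumed for demonstration, since the slide's own expressions are not reproduced above): a density such as f(x, y) = 6e^(−2x) e^(−3y) for x, y > 0 factors as h(x) g(y) with h(x) = 2e^(−2x) and g(y) = 3e^(−3y), so X and Y are independent; a density such as f(x, y) = 24xy on the triangle 0 < x < 1, 0 < y < 1, 0 < x + y < 1 cannot be written in that form, because the constraint x + y < 1 ties the admissible values of x to the value of y.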
EXAMPLE 8
Let be independent and uniformly distributed over . Compute .
SUMS OF INDEPENDENT RANDOM VARIABLES
It is often important to be able to calculate the distribution of X + Y from the distributions of X and Y when X and Y are independent. The distribution function F_{X+Y} is called the convolution of the distributions F_X and F_Y, and it is given by

F_{X+Y}(a) = P{X + Y ≤ a} = ∫_{−∞}^{∞} F_X(a − y) f_Y(y) dy.

Differentiating yields the density of the sum:

f_{X+Y}(a) = ∫_{−∞}^{∞} f_X(a − y) f_Y(y) dy.
EXAMPLE 8
If X and Y are independent random variables, both uniformly distributed on (0, 1), calculate the probability density of X + Y.
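Carrying the convolution formula through for this case gives the triangular density: for 0 ≤ a ≤ 1,

f_{X+Y}(a) = ∫_{0}^{a} 1 dy = a,

while for 1 < a < 2,

f_{X+Y}(a) = ∫_{a−1}^{1} 1 dy = 2 − a,

and f_{X+Y}(a) = 0 otherwise. Because of the shape of its graph, X + Y is said to have a triangular distribution.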
In the discrete case, the conditional probability mass function of X given that Y = y is p_{X|Y}(x|y) = P{X = x | Y = y} = p(x, y) / p_Y(y). If X is independent of Y, then the conditional mass function and distribution function are the same as the unconditional ones:

p_{X|Y}(x|y) = P{X = x} = p_X(x)   and   F_{X|Y}(x|y) = P{X ≤ x} = F_X(x).
EXAMPLE 11
Suppose that p(x, y), the joint probability mass function of X and Y, is given by
Let X₁ and X₂ be jointly continuous random variables with joint density function f_{X₁,X₂}, and suppose Y₁ = g₁(X₁, X₂) and Y₂ = g₂(X₁, X₂) for functions g₁ and g₂ satisfying the following two conditions.

1. The equations y₁ = g₁(x₁, x₂) and y₂ = g₂(x₁, x₂) can be uniquely solved for x₁ and x₂ in terms of y₁ and y₂, with solutions given by, say, x₁ = h₁(y₁, y₂) and x₂ = h₂(y₁, y₂).

2. At all points (x₁, x₂), the functions g₁ and g₂ have continuous partial derivatives and are such that the following 2 by 2 determinant (the Jacobian) is nonzero:

J(x₁, x₂) = | ∂g₁/∂x₁  ∂g₁/∂x₂ |
            | ∂g₂/∂x₁  ∂g₂/∂x₂ |  ≠ 0.
JOINT PROBABILITY DISTRIBUTION OF FUNCTIONS OF RANDOM VARIABLES
Under these two conditions, it can be shown that the random variables Y₁ and Y₂ are jointly continuous with joint density function

f_{Y₁,Y₂}(y₁, y₂) = f_{X₁,X₂}(x₁, x₂) |J(x₁, x₂)|⁻¹,

where x₁ = h₁(y₁, y₂) and x₂ = h₂(y₁, y₂). That is, since the equations y₁ = g₁(x₁, x₂) and y₂ = g₂(x₁, x₂) have the solution x₁ = h₁(y₁, y₂), x₂ = h₂(y₁, y₂), the desired density is

f_{Y₁,Y₂}(y₁, y₂) = f_{X₁,X₂}(h₁(y₁, y₂), h₂(y₁, y₂)) |J(h₁(y₁, y₂), h₂(y₁, y₂))|⁻¹.
EXAMPLE 14
Let X and Y be independent standard normal random variables. For the given transformation, compute the joint density function of the transformed variables.
The Jacobian of these transformations is given by
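A small symbolic sketch of the technique (the transformation U = X + Y, V = X − Y used below is an assumption chosen for illustration; substitute the slide's actual transformation as needed):

    import sympy as sp

    x, y, u, v = sp.symbols('x y u v', real=True)

    # Assumed illustrative transformation: U = X + Y, V = X - Y
    g = sp.Matrix([x + y, x - y])
    J = g.jacobian([x, y])                    # [[1, 1], [1, -1]]
    print(sp.Abs(J.det()))                    # |J| = 2

    # Joint density of two independent standard normals
    f_xy = sp.exp(-(x**2 + y**2) / 2) / (2 * sp.pi)
    # Transformation formula: evaluate f at the inverse point and divide by |J|
    f_uv = f_xy.subs({x: (u + v) / 2, y: (u - v) / 2}) / 2
    print(sp.simplify(f_uv))                  # exp(-(u**2 + v**2)/4) / (4*pi), which factors in u and v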
What is the expected value of the time until the electronic device goes down?
The expected value of the time until the electronic device goes down is given by:
EXERCISE 4
Let X and Y be independent and exponentially distributed random variables with
parameters 1 and 2, respectively. Find the joint density function of S = and R = and
the expected value of .
EXERCISE 5
In the final of the baseball World Series, two teams play a series of at most seven games until one of the two teams has won four games. Two unevenly matched teams face each other, and the probability that the weaker team will win any given game is equal to 0.45. What is the joint probability mass function of the number of games played in the final if we know that the weaker team has won the final?
Let X be equal to 0 if the weaker team is the winner of the final, and let Y be the number of matches played.
EXERCISE 6
The joint probability mass function (PMF) of the lifetimes X and Y of two connected components in a machine can be modeled by: