
Chapter 06 - Jointly Distributed Random Variables

This document discusses joint probability distributions of random variables. It defines joint distribution functions and probability density functions for both discrete and continuous random variables. It also discusses concepts such as marginal distributions, independence of random variables, and conditional probability distributions. Several examples are provided to illustrate computing probabilities and distributions for sums, functions, and conditional distributions of random variables.

Uploaded by Batu Gün

JOINTLY DISTRIBUTED

RANDOM VARIABLES
Kutay TİNÇ, Ph.D.
JOINT DISTRIBUTION
FUNCTIONS
We are often interested in probability statements concerning two or more random
variables. To deal with such probabilities, we define, for any two random
variables X and Y, the joint cumulative probability distribution function of X and Y by:

F(a, b) = P(X ≤ a, Y ≤ b),   −∞ < a, b < ∞

The marginal distributions of X and Y, F_X and F_Y, can be obtained as:

F_X(a) = P(X ≤ a) = F(a, ∞),   F_Y(b) = P(Y ≤ b) = F(∞, b)


JOINT DISTRIBUTION
FUNCTIONS
All joint probability statements about X and Y can, in theory, be answered in terms of
their joint distribution function. For instance, suppose we want to compute the joint
probability that X is greater than a and Y is greater than b. This can be done as
follows:

P(X > a, Y > b) = 1 − F_X(a) − F_Y(b) + F(a, b)
JOINT PMF (DISCRETE)
In the case where X and Y are both discrete random variables, it is convenient to define
the joint probability mass function of X and Y by

p(x, y) = P(X = x, Y = y)

The marginal mass functions of X and Y can be obtained as follows:

p_X(x) = Σ_y p(x, y),   p_Y(y) = Σ_x p(x, y)


EXAMPLE 1
 
Suppose that 3 balls are randomly selected from an urn containing 3 red, 4 white, and
5 blue balls. If we let X and Y denote, respectively, the number of red and white balls
chosen, the joint probability mass function of X and Y is

p(i, j) = C(3, i) C(4, j) C(5, 3 − i − j) / C(12, 3),   i, j ≥ 0, i + j ≤ 3
EXAMPLE 1 - CONTINUED
With C(12, 3) = 220, the joint pmf p(i, j) (rows i = red, columns j = white) is:

i \ j      0        1        2        3      Sum
0        10/220   40/220   30/220    4/220   84/220
1        30/220   60/220   18/220    0      108/220
2        15/220   12/220    0        0       27/220
3         1/220    0        0        0        1/220
Sum      56/220  112/220   48/220    4/220    1
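The table above can be reproduced directly from the hypergeometric-style pmf; a minimal sketch:

```python
from math import comb

# Joint pmf for Example 1: X = number of red, Y = number of white balls
# when 3 balls are drawn without replacement from 3 red, 4 white, 5 blue.
def p(i, j):
    if i < 0 or j < 0 or i + j > 3:
        return 0.0
    return comb(3, i) * comb(4, j) * comb(5, 3 - i - j) / comb(12, 3)

# Full table of probabilities; entries with i + j > 3 are 0.
table = {(i, j): p(i, j) for i in range(4) for j in range(4)}
```

Summing all entries of `table` returns 1, confirming the pmf is complete.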
JOINT PDF (CONTINUOUS)
We say that X and Y are jointly continuous if there exists a function f(x, y), defined for all real
x and y, having the property that, for every set C of pairs of real numbers (that is, C is a set
in the two-dimensional plane),

P((X, Y) ∈ C) = ∫∫_{(x,y)∈C} f(x, y) dx dy

The function f is called the joint probability density function of X and Y. If A and B are any
sets of real numbers, then, by defining C = {(x, y): x ∈ A, y ∈ B}, we see from the equation above:

P(X ∈ A, Y ∈ B) = ∫_B ∫_A f(x, y) dx dy
MARGINAL PDFS
If X and Y are jointly continuous, they are also individually continuous, and their
probability density functions can be obtained as follows:

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy,   f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx
EXAMPLE 2
 The joint density function of X and Y is given by

Compute:
a) ,
b) ,
c) .
EXAMPLE 2 - CONTINUED
 
EXAMPLE 3
 The joint density of X and Y is given by:

Find the density function of the random variable .

Taking the derivative gives:


EXAMPLE 4
The joint probability density function of X and Y is given by:

a) Find c.
b) Find the marginal densities of X and Y.
MORE THAN 2 RANDOM
VARIABLES
We can also define joint probability distributions for n random variables in exactly the
same manner as we did for n = 2. For instance, the joint cumulative probability distribution
function of the n random variables X_1, X_2, …, X_n is defined by:

F(a_1, a_2, …, a_n) = P(X_1 ≤ a_1, X_2 ≤ a_2, …, X_n ≤ a_n)

Further, the n random variables are said to be jointly continuous if there exists a
function f(x_1, …, x_n), called the joint probability density function, such that for any set C in n-space

P((X_1, …, X_n) ∈ C) = ∫∫…∫_{(x_1,…,x_n)∈C} f(x_1, …, x_n) dx_1 … dx_n
MULTINOMIAL DISTRIBUTION
 
The multinomial distribution arises when a sequence of n independent and identical
experiments is performed. Suppose that each experiment can result in any one of r
possible outcomes, with respective probabilities p_1, p_2, …, p_r, where Σ_{i=1}^{r} p_i = 1.

If we let X_i denote the number of the n experiments that result in outcome number i, then

P(X_1 = n_1, X_2 = n_2, …, X_r = n_r) = n! / (n_1! n_2! ⋯ n_r!) · p_1^{n_1} p_2^{n_2} ⋯ p_r^{n_r}

whenever n_1 + n_2 + … + n_r = n.
EXAMPLE 5
 
Suppose that a fair die is rolled 9 times. Find the probability that 1 appears three
times, 2 and 3 twice each, 4 and 5 once each, and 6 not at all.
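The multinomial formula applied to Example 5 can be sketched as follows (the helper name is illustrative):

```python
from math import factorial

# Multinomial probability for Example 5: a fair die is rolled 9 times and
# faces 1..6 appear (3, 2, 2, 1, 1, 0) times; each face has probability 1/6.
def multinomial_pmf(counts, probs):
    coef = factorial(sum(counts))          # n!
    for c in counts:
        coef //= factorial(c)              # divide by each n_i!
    prob = float(coef)
    for c, q in zip(counts, probs):
        prob *= q ** c                     # multiply by each p_i^{n_i}
    return prob

answer = multinomial_pmf([3, 2, 2, 1, 1, 0], [1/6] * 6)
# = 9!/(3! 2! 2! 1! 1! 0!) * (1/6)^9 = 15120 / 6^9 ≈ 0.0015
```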
INDEPENDENT RANDOM
VARIABLES
 
The random variables X and Y are said to be independent if for any two sets of real
numbers A and B,

P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B)

In other words, X and Y are independent if, for all A and B, the events {X ∈ A} and {Y ∈ B}
are independent. The equation above can also be translated into:

P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b)   for all a, b

Hence, in terms of the joint distribution function F of X and Y, X and Y are independent if

F(a, b) = F_X(a) F_Y(b)   for all a, b
INDEPENDENT RANDOM
VARIABLES
 
Suppose that n + m independent trials having a common probability p of success are
performed. If X is the number of successes in the first n trials, and Y is the number of
successes in the final m trials, then X and Y are independent, since knowing the number of
successes in the first n trials does not affect the distribution of the number of successes
in the final m trials (by the assumption of independent trials). In fact, for integral x and y,

P(X = x, Y = y) = C(n, x) p^x (1 − p)^{n−x} · C(m, y) p^y (1 − p)^{m−y} = P(X = x) P(Y = y)

In contrast, X and Z will be dependent, where Z is the total number of successes in the
n + m trials. (Why?)
EXAMPLE 6
A man and a woman decide to meet at a certain location. If each of them
independently arrives at a time uniformly distributed between 12 noon and 1 P.M.,
find the probability that the first to arrive has to wait longer than 10 minutes.

If we let X and Y denote, respectively, the time (in minutes) past 12 that the man and
the woman arrive, then X and Y are independent random variables, each uniformly
distributed over (0, 60). The desired probability, P(X + 10 < Y) + P(Y + 10 < X), which, by
symmetry, equals 2 P(X + 10 < Y), is obtained as follows:

2 P(X + 10 < Y) = 2 ∫∫_{x+10<y} (1/60)² dx dy = 2 (1/60)² ∫_{10}^{60} (y − 10) dy = 25/36
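A quick Monte Carlo sketch of the meeting problem (simulation only, not part of the derivation):

```python
import random

# Example 6 by simulation: arrival times X, Y ~ Uniform(0, 60) minutes
# past noon; estimate P(|X - Y| > 10). Exact answer: (50/60)^2 = 25/36.
random.seed(42)
n = 200_000
hits = sum(abs(random.uniform(0, 60) - random.uniform(0, 60)) > 10
           for _ in range(n))
estimate = hits / n   # should be close to 25/36 ≈ 0.694
```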
INDEPENDENT RANDOM
VARIABLES
A necessary and sufficient condition for the random variables X and Y to be independent
is for their joint probability density function (or joint probability mass function in the
discrete case) to factor into two terms, one depending only on x and the other
depending only on y.
The continuous (discrete) random variables X and Y are independent if and only if their
joint probability density (mass) function can be expressed as:

f(x, y) = h(x) g(y),   −∞ < x < ∞, −∞ < y < ∞
EXAMPLE 7
If the joint density function of X and Y is:

and is equal to 0 outside this region, are the random variables independent? What if
the joint density function is:

and 0 otherwise?
The first joint density function factors into a function of x times a function of y, and
thus the random variables are independent. In the second one, the region of positive
density cannot be expressed as a product of a set of x-values and a set of y-values, and
hence the random variables are not independent.
EXAMPLE 8
 Let be independent and uniformly distributed over . Compute .
SUMS OF INDEPENDENT
RANDOM VARIABLES
It is often important to be able to calculate the distribution of X + Y from the distributions
of X and Y when X and Y are independent. F_{X+Y} is called the convolution of the distributions
F_X and F_Y:

F_{X+Y}(a) = P(X + Y ≤ a) = ∫_{−∞}^{∞} F_X(a − y) f_Y(y) dy
EXAMPLE 8
If X and Y are independent random variables, both uniformly distributed on (0, 1), calculate
the probability density of X + Y.

f_{X+Y}(a) = ∫_0^1 f_X(a − y) dy

For 0 ≤ a ≤ 1 it can be written as:

f_{X+Y}(a) = ∫_0^a dy = a

For 1 < a < 2 it can be written as:

f_{X+Y}(a) = ∫_{a−1}^1 dy = 2 − a

Hence, f_{X+Y}(a) = a for 0 ≤ a ≤ 1, 2 − a for 1 < a < 2, and 0 otherwise (the triangular density).
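The triangular shape can be checked by simulation; the CDF of the triangular density gives P(X + Y ≤ a) = a²/2 on 0 ≤ a ≤ 1:

```python
import random

# Monte Carlo check for the sum of two independent Uniform(0, 1) variables:
# P(X + Y <= 0.5) = 0.5^2 / 2 = 0.125 and P(X + Y <= 1) = 0.5.
random.seed(7)
n = 200_000
sums = [random.random() + random.random() for _ in range(n)]
p_half = sum(s <= 0.5 for s in sums) / n
p_one = sum(s <= 1.0 for s in sums) / n
```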
EXAMPLE 9
If X and Y are independent Poisson random variables with respective parameters λ₁ and λ₂, compute the
distribution of X + Y.
Because the event {X + Y = n} may be written as the union of the disjoint events {X = k, Y = n − k}, 0 ≤ k ≤ n,

P(X + Y = n) = Σ_{k=0}^{n} e^{−λ₁} λ₁^k / k! · e^{−λ₂} λ₂^{n−k} / (n − k)! = e^{−(λ₁+λ₂)} (λ₁ + λ₂)^n / n!

In other words, X + Y has a Poisson distribution with parameter λ₁ + λ₂.


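The identity above can be verified numerically by convolving the two pmfs (λ values below are arbitrary):

```python
from math import exp, factorial

# Check of Example 9: the convolution of Poisson(lam1) and Poisson(lam2)
# pmfs at n equals the Poisson(lam1 + lam2) pmf at n.
def poisson_pmf(lam, k):
    return exp(-lam) * lam ** k / factorial(k)

lam1, lam2, n = 1.5, 2.5, 4
conv = sum(poisson_pmf(lam1, k) * poisson_pmf(lam2, n - k)
           for k in range(n + 1))
direct = poisson_pmf(lam1 + lam2, n)
```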
EXAMPLE 10
Let X and Y be independent binomial random variables with respective parameters (n, p)
and (m, p). Calculate the distribution of X + Y.
Without any computation at all we can immediately conclude, by recalling the
interpretation of a binomial random variable, that X + Y is binomial with parameters
(n + m, p). If we want a mathematical proof (with q = 1 − p):

P(X + Y = k) = Σ_{i=0}^{k} C(n, i) p^i q^{n−i} · C(m, k − i) p^{k−i} q^{m−k+i} = C(n + m, k) p^k q^{n+m−k}
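The same check works numerically; the parameter values below are arbitrary:

```python
from math import comb

# Check of Example 10: convolving Binomial(n, p) and Binomial(m, p) pmfs
# gives the Binomial(n + m, p) pmf.
def binom_pmf(n, p, k):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

n, m, p, k = 5, 7, 0.3, 4
conv = sum(binom_pmf(n, p, i) * binom_pmf(m, p, k - i)
           for i in range(k + 1))
direct = binom_pmf(n + m, p, k)
```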
CONDITIONAL
DISTRIBUTIONS: DISCRETE
CASE
If X and Y are discrete random variables, we can define the conditional probability mass
function of X given that Y = y by

p_{X|Y}(x | y) = P(X = x | Y = y) = p(x, y) / p_Y(y)

for all values of y such that p_Y(y) > 0.


CONDITIONAL
DISTRIBUTIONS: DISCRETE
CASE
 
Similarly, the conditional probability distribution function of X given that Y = y is defined, for all
y such that p_Y(y) > 0, by

F_{X|Y}(x | y) = P(X ≤ x | Y = y) = Σ_{a ≤ x} p_{X|Y}(a | y)

If X is independent of Y, then the conditional mass function and distribution function are the
same as the unconditional ones:

p_{X|Y}(x | y) = p_X(x),   F_{X|Y}(x | y) = F_X(x)
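A small sketch of these definitions on a toy joint pmf (the table values below are illustrative placeholders, not from the slides):

```python
# Conditional pmf computed from a small joint pmf table.
joint = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.3}

def p_Y(y):
    # Marginal mass of Y: sum the joint pmf over all x.
    return sum(p for (x, yv), p in joint.items() if yv == y)

def p_X_given_Y(x, y):
    # Conditional mass: p(x, y) / p_Y(y), defined when p_Y(y) > 0.
    return joint[(x, y)] / p_Y(y)
```

For each fixed y, the values p_X_given_Y(·, y) sum to 1, as a conditional pmf must.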
EXAMPLE 11
Suppose that p(x, y), the joint probability mass function of X and Y, is given by

Calculate the conditional probability mass function of X, given that Y = y.


CONDITIONAL
DISTRIBUTIONS:
CONTINUOUS CASE
If X and Y have a joint probability density function f(x, y), then the conditional probability
density function of X, given that Y = y, is defined for all values of y such that f_Y(y) > 0, by

f_{X|Y}(x | y) = f(x, y) / f_Y(y)

The use of conditional densities allows us to define conditional probabilities of events
associated with one random variable when we are given the value of a second random
variable. That is, if X and Y are jointly continuous, then for any set A,

P(X ∈ A | Y = y) = ∫_A f_{X|Y}(x | y) dx
CONDITIONAL
DISTRIBUTIONS:
CONTINUOUS CASE
In particular, by letting A = (−∞, a], we can define the conditional cumulative distribution
function of X, given that Y = y, by

F_{X|Y}(a | y) = P(X ≤ a | Y = y) = ∫_{−∞}^{a} f_{X|Y}(x | y) dx
EXAMPLE 12
Find the conditional density of X given Y = y, where the joint density of X and Y is given by
CONDITIONAL
DISTRIBUTIONS:
CONTINUOUS CASE
 
If X and Y are independent continuous random variables, the conditional density of X, given
Y = y, is just the unconditional density of X. This is so because, in the independent case,

f_{X|Y}(x | y) = f(x, y) / f_Y(y) = f_X(x) f_Y(y) / f_Y(y) = f_X(x)
JOINT PROBABILITY
DISTRIBUTION OF FUNCTIONS OF
RANDOM VARIABLES
Let X₁ and X₂ be jointly continuous random variables with joint probability density
function f_{X₁,X₂}.
It is sometimes necessary to obtain the joint distribution of the random
variables Y₁ and Y₂, which arise as functions of X₁ and X₂.
Specifically, suppose that Y₁ = g₁(X₁, X₂) and Y₂ = g₂(X₁, X₂) for some functions g₁ and g₂.
JOINT PROBABILITY
DISTRIBUTION OF FUNCTIONS OF
RANDOM VARIABLES
Assume that the functions g₁ and g₂ satisfy the following conditions:

1. The equations y₁ = g₁(x₁, x₂) and y₂ = g₂(x₁, x₂) can be uniquely solved for x₁ and x₂
in terms of y₁ and y₂, with solutions given by, say, x₁ = h₁(y₁, y₂), x₂ = h₂(y₁, y₂).

2. For all points (x₁, x₂), the functions g₁ and g₂ have continuous partial derivatives and are
such that the following 2 by 2 determinant (the Jacobian) is nonzero:

J(x₁, x₂) = (∂g₁/∂x₁)(∂g₂/∂x₂) − (∂g₁/∂x₂)(∂g₂/∂x₁) ≠ 0
JOINT PROBABILITY
DISTRIBUTION OF FUNCTIONS OF
RANDOM VARIABLES
Under these two conditions it can be shown that the random variables Y₁ and Y₂ are
jointly continuous with joint density function given by

f_{Y₁,Y₂}(y₁, y₂) = f_{X₁,X₂}(x₁, x₂) |J(x₁, x₂)|⁻¹

where x₁ = h₁(y₁, y₂), x₂ = h₂(y₁, y₂).


EXAMPLE 13
Let X₁ and X₂ be jointly continuous random variables with probability density function f_{X₁,X₂}.
Let Y₁ = X₁ + X₂ and Y₂ = X₁ − X₂. Find the joint density function of Y₁ and Y₂ in terms of f_{X₁,X₂}.
Let g₁(x₁, x₂) = x₁ + x₂ and g₂(x₁, x₂) = x₁ − x₂.
Then J(x₁, x₂) = −2, so |J| = 2.

Also, as the equations y₁ = x₁ + x₂ and y₂ = x₁ − x₂ have their solution x₁ = (y₁ + y₂)/2,
x₂ = (y₁ − y₂)/2, the desired density is:

f_{Y₁,Y₂}(y₁, y₂) = (1/2) f_{X₁,X₂}((y₁ + y₂)/2, (y₁ − y₂)/2)
EXAMPLE 14
Let X and Y be independent standard normal random variables. Compute the joint
density function of the transformed pair.
The Jacobian of the transformation is given by

Variable transformations yield:


EXAMPLE 14 - CONTINUED
 
EXERCISE 1
X and Y have the joint density function:

a) Compute the joint density function of .


b) What are the marginal densities of X and Y?
EXERCISE 1 - SOLUTION
 
EXERCISE 1 - SOLUTION
 
EXERCISE 2
Let X and Y be random losses with joint density function

An insurance policy is written to reimburse . Calculate the probability that the
reimbursement is less than 1.
EXERCISE 3
An electronic device contains two circuits. The second circuit is a backup for the first
and is switched on only when the first circuit has failed. The electronic device goes
down when the second circuit fails. The continuous random variables X and Y denote the
lifetimes of the first circuit and the second circuit, and have the joint density function

What is the expected value of the time until the electronic device goes down?
The expected value of the time until the electronic device goes down is given by:
EXERCISE 4
 
Let X and Y be independent and exponentially distributed random variables with
parameters 1 and 2, respectively. Find the joint density function of S = and R = and
the expected value of .
EXERCISE 5
In the final of the World Series Baseball, two teams play a series of at most seven
games until one of the two teams has won four games. Two unevenly matched teams
face each other, and the probability that the weaker team will win any given game is
equal to 0.45. What is the joint probability mass function of the number of games
played in the final if we know that the weaker team has won the final?
Let I be equal to 0 if the weaker team is the winner of the final, and let N be the number of
matches played.
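A sketch of the computation, using illustrative names: the weaker team (per-game win probability p = 0.45) wins the final in exactly k games iff it wins 3 of the first k − 1 games and then wins game k.

```python
from math import comb

# Joint probabilities P(weaker team wins, k games played), k = 4..7.
p = 0.45

def p_weaker_wins_in(k):
    # Negative-binomial-style term: 3 wins in first k-1 games, then a win.
    return comb(k - 1, 3) * p ** 4 * (1 - p) ** (k - 4)

joint = {k: p_weaker_wins_in(k) for k in range(4, 8)}
p_weaker_wins_final = sum(joint.values())  # ≈ 0.3917
```

Dividing each entry of `joint` by `p_weaker_wins_final` gives the conditional pmf of the number of games, given that the weaker team won.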
EXERCISE 6
The joint probability mass function (PMF) of the lifetimes X and Y of two
connected components in a machine can be modeled by:

a) Find the marginal PMFs of X and Y.


b) What is the joint probability mass function of and ? Are and independent? Explain.
EXERCISE 6 - SOLUTION
 
