CHAPTER FOUR
4. JOINT AND CONDITIONAL PROBABILITY DISTRIBUTIONS
4.1. Joint and Marginal Probability Distributions
4.1.1. Joint probability distribution
If X and Y are two random variables, the probability distribution for their simultaneous occurrence can be represented by a function f(x, y) for any pair of values (x, y) within the range of X and Y. This function is known as the joint probability distribution of (X, Y).
Definition 1. Let (X, Y) be a two-dimensional discrete random variable. With each possible outcome (xi, yj) we associate a number p(xi, yj) representing P(X = xi, Y = yj) and satisfying the following conditions:
1. p(xi, yj) ≥ 0 for all (xi, yj)
2. ∑i ∑j p(xi, yj) = 1
The function p is the joint probability mass function (PMF) of (X, Y). The set of triples (xi, yj, p(xi, yj)), i, j = 1, 2, 3, …, is the joint probability distribution of (X, Y).
Definition 2. Let (X, Y) be a two-dimensional continuous random variable assuming all values in some region R of the Euclidean plane. Then the joint probability density function f is a function satisfying the following conditions:
1. f(x, y) ≥ 0 for all (x, y) in R
2. ∬R f(x, y) dx dy = 1
Example:
1. Two production lines, 1 and 2, have capacities of 5 and 3 items per day, respectively. Assume the number of items produced by each line is a random variable, and let (X, Y) be the two-dimensional random variable giving the number of items produced by line 1 and line 2, respectively.
Y\X    0      1      2      3      4      5
0      0.00   0.01   0.03   0.05   0.07   0.09
1      0.01   0.02   0.04   0.05   0.06   0.08
2      0.01   0.03   0.05   0.05   0.05   0.06
3      0.01   0.02   0.04   0.06   0.06   0.05
A. Show that p(x, y) is a legitimate probability function of (X, Y).
B. What is the probability that both lines produce the same number of items?
C. What is the probability that more items are produced by line 2?
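Parts A–C can also be checked numerically. A minimal Python sketch over the table above:

```python
# Joint PMF table from the example: p[y][x], where X is line 1's
# output (0..5) and Y is line 2's output (0..3).
p = [
    [0.00, 0.01, 0.03, 0.05, 0.07, 0.09],  # y = 0
    [0.01, 0.02, 0.04, 0.05, 0.06, 0.08],  # y = 1
    [0.01, 0.03, 0.05, 0.05, 0.05, 0.06],  # y = 2
    [0.01, 0.02, 0.04, 0.06, 0.06, 0.05],  # y = 3
]

# A. All entries are nonnegative and the table sums to 1.
total = sum(sum(row) for row in p)

# B. P(X = Y): both lines produce the same number of items.
same = sum(p[k][k] for k in range(4))

# C. P(Y > X): line 2 produces more items than line 1.
more = sum(p[y][x] for y in range(4) for x in range(6) if y > x)

print(round(total, 10), round(same, 10), round(more, 10))  # 1.0 0.13 0.12
```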
2. Let (X, Y) be a two-dimensional discrete random variable with joint PMF p(x, y).
Solution:
1. A. Every entry in the table is nonnegative, and the entries sum to
0.25 + 0.26 + 0.25 + 0.24 = 1,
so p(x, y) is a legitimate probability function of (X, Y).
B. P(X = Y) = p(0, 0) + p(1, 1) + p(2, 2) + p(3, 3) = 0 + 0.02 + 0.05 + 0.06 = 0.13
C. P(Y > X) = p(0, 1) + p(0, 2) + p(0, 3) + p(1, 2) + p(1, 3) + p(2, 3)
= 0.01 + 0.01 + 0.01 + 0.03 + 0.02 + 0.04 = 0.12
2. A. To show that p is a legitimate joint PMF, verify that
I. p(x, y) ≥ 0 for every pair (x, y), and
II. ∑x ∑y p(x, y) = 1, the double sum running over all possible pairs.
3. A. Similarly, for a continuous (X, Y) with joint PDF f, verify that
I. f(x, y) ≥ 0 for all (x, y)
BY: Habitamu W.
Statistics for Economists (Econ2042) 2024/2025
II. ∬ f(x, y) dx dy = 1, the integral taken over the region where f is positive.
3B. For a region of the form a ≤ X ≤ b, c ≤ Y ≤ d,
P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_c^d ∫_a^b f(x, y) dx dy
NB: The joint CDF is F(x, y) = P(X ≤ x, Y ≤ y) = ∫_−∞^y ∫_−∞^x f(s, t) ds dt
Example:
1. Let (X, Y) have the joint probability function given by
Y\X      0      1      2      pY(y)
0        0.25   0.15   0.10   0.50
1        0.10   0.08   0.10   0.28
2        0.05   0.07   0.10   0.22
pX(x)    0.40   0.30   0.30   1
Find the marginal distributions of X and Y.
Solution:
i. The marginal distribution of X is pX(x) = ∑y p(x, y), the column totals:
X        0      1      2      Total
pX(x)    0.40   0.30   0.30   1
ii. The marginal distribution of Y is pY(y) = ∑x p(x, y), the row totals:
Y        0      1      2      Total
pY(y)    0.50   0.28   0.22   1
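The row and column totals can be reproduced in a few lines of Python:

```python
# Joint PMF from the example: p[y][x], with x and y each in {0, 1, 2}.
p = [
    [0.25, 0.15, 0.10],  # y = 0
    [0.10, 0.08, 0.10],  # y = 1
    [0.05, 0.07, 0.10],  # y = 2
]

# Marginal of X: sum each column over y.
p_x = [sum(p[y][x] for y in range(3)) for x in range(3)]
# Marginal of Y: sum each row over x.
p_y = [sum(row) for row in p]

print([round(v, 2) for v in p_x])  # [0.4, 0.3, 0.3]
print([round(v, 2) for v in p_y])  # [0.5, 0.28, 0.22]
```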
2. The joint PMF of two random variables X and Y is given by
PX,Y(x, y) = { k · (given expression in x and y), (x, y) in the support
            { 0, otherwise
Where k is a constant:
A. What is the value of k?
B. Find the marginal PMFs of X and Y.
Solution:
A. To evaluate k, we remember that any joint PMF must satisfy
∑x ∑y PX,Y(x, y) = 1.
Summing the given PMF over all pairs (x, y) in the support yields an expression in k; setting that expression equal to 1 and solving gives the value of k.
B. The marginal PMFs are obtained by summing out the other variable:
PX(x) = ∑y PX,Y(x, y) and PY(y) = ∑x PX,Y(x, y).
For a continuous two-dimensional random variable (X, Y) with joint PDF f, the marginal PDFs are obtained by integrating out the other variable:
fX(x) = ∫_−∞^∞ f(x, y) dy
Likewise, fY(y) = ∫_−∞^∞ f(x, y) dx
PX(x) = ∑y PX,Y(x, y), x = 1, 2, and
PY(y) = ∑x PX,Y(x, y), y = 1, 2.
B. The conditional PMF of X given that Y = y is
PX|Y(x | y) = PX,Y(x, y) / PY(y), for PY(y) > 0,
and likewise the conditional PMF of Y given X = x is PY|X(y | x) = PX,Y(x, y) / PX(x), for PX(x) > 0.
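As an illustration, the conditional PMF can be computed directly from the 3 × 3 joint table used in the marginal-distribution example above:

```python
# Conditional PMF of X given Y = y, using the joint table from the
# marginal-distribution example (x and y each in {0, 1, 2}).
p = [
    [0.25, 0.15, 0.10],  # y = 0
    [0.10, 0.08, 0.10],  # y = 1
    [0.05, 0.07, 0.10],  # y = 2
]

def p_x_given_y(y):
    """p(x | y) = p(x, y) / pY(y) for each x."""
    p_y = sum(p[y])                      # marginal P(Y = y)
    return [p[y][x] / p_y for x in range(3)]

cond = p_x_given_y(0)
print([round(v, 2) for v in cond])  # [0.5, 0.3, 0.2]
print(round(sum(cond), 10))         # 1.0 (a conditional PMF sums to 1)
```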
2. Suppose that the two-dimensional continuous random variable (X, Y) has joint PDF f(x, y). Determine the conditional PDF of X given Y = y and the conditional PDF of Y given X = x.
Solution: To determine the conditional PDFs, we first evaluate the marginal PDFs, which are given by
fX(x) = ∫ f(x, y) dy and fY(y) = ∫ f(x, y) dx.
Hence,
fX|Y(x | y) = f(x, y) / fY(y), for fY(y) > 0, and
fY|X(y | x) = f(x, y) / fX(x), for fX(x) > 0.
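A numerical sketch of the same procedure, using a stand-in density f(x, y) = x + y on the unit square (a hypothetical example, not the density from the original problem): the marginal fX(x) = x + 1/2 follows by integrating out y, and the resulting conditional density must integrate to 1.

```python
# Conditional PDF check for an assumed joint density f(x, y) = x + y
# on 0 <= x, y <= 1 (hypothetical example for illustration).

def f(x, y):        # joint PDF on the unit square
    return x + y

def f_X(x):         # marginal of X: integral of (x + y) dy over [0, 1]
    return x + 0.5

def f_y_given_x(y, x):   # f(y | x) = f(x, y) / fX(x)
    return f(x, y) / f_X(x)

# The conditional density must integrate to 1 over y in [0, 1];
# check with a midpoint-rule sum at x = 0.3.
n = 10_000
h = 1.0 / n
total = sum(f_y_given_x((i + 0.5) * h, 0.3) * h for i in range(n))
print(round(total, 6))  # 1.0
```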
1. Suppose (X, Y) are discrete random variables with probability function given by
Y\X      −1     0      1      PY(y)
−1       1/8    1/8    1/8    3/8
0        1/8    0      1/8    2/8
1        1/8    1/8    1/8    3/8
PX(x)    3/8    2/8    3/8    1
The covariance of X and Y is defined as
Cov(X, Y) = E[(X − E(X))(Y − E(Y))] = ∫∫ (x − E(X))(y − E(Y)) f(x, y) dx dy
in the continuous case. Likewise, in the discrete case, Cov(X, Y) = ∑x ∑y (x − E(X))(y − E(Y)) p(x, y).
Note that there is a simple relationship between the covariance and moments about the origin that can be used for its calculation:
Cov(X, Y) = E(XY) − E(X)E(Y)
Proof: This result follows directly from the properties of the expectation operator. In particular, by definition,
Cov(X, Y) = E[(X − E(X))(Y − E(Y))]
= E[XY − X E(Y) − Y E(X) + E(X)E(Y)]
= E(XY) − E(X)E(Y) − E(Y)E(X) + E(X)E(Y)
= E(XY) − E(X)E(Y)
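The shortcut can be verified against the defining formula on any joint PMF; here is a minimal sketch using a small made-up PMF over {0, 1} × {0, 1} (the probabilities are hypothetical, chosen only for illustration):

```python
# Check Cov(X, Y) = E(XY) - E(X)E(Y) against the defining formula
# on a small made-up joint PMF over {0, 1} x {0, 1}.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

ex  = sum(x * pr for (x, y), pr in p.items())        # E(X)
ey  = sum(y * pr for (x, y), pr in p.items())        # E(Y)
exy = sum(x * y * pr for (x, y), pr in p.items())    # E(XY)

# Defining formula: E[(X - E(X))(Y - E(Y))]
cov_def = sum((x - ex) * (y - ey) * pr for (x, y), pr in p.items())
# Shortcut: E(XY) - E(X)E(Y)
cov_short = exy - ex * ey

print(round(cov_def, 10), round(cov_short, 10))  # 0.15 0.15
```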
Example: Let the bivariate random variable (X, Y) have a joint density function
f(x, y) = { … }. Find Cov(X, Y).
Example: Let X and Y be two random variables having a joint density function given by
f(x, y) = { … }.
Note that this density implies that (x, y) points are equally likely to occur on and below the parabola represented by the graph of the boundary curve. There is a direct functional dependence between X and the range of Y, so that fY|X(y | x) will change as x changes; thus X and Y must be dependent random variables. Nonetheless, it can be shown that Cov(X, Y) = 0, so zero covariance does not by itself imply independence.
The correlation coefficient tells us the degree of association and the direction of the linear relationship between two random variables.
The correlation coefficient computed from sample data measures the strength and direction of a linear relationship between two variables.
The symbol for the sample correlation coefficient is r; the symbol for the population correlation coefficient is ρ, where
ρ = Cov(X, Y) / (σX σY),
with σX and σY the standard deviations of X and Y.
For a function g of (X, Y), the expected value is
E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy in the continuous case, and
E[g(X, Y)] = ∑x ∑y g(x, y) p(x, y) in the discrete case.
If X and Y are independent, then f(x, y) = fX(x) fY(y), so
E[g(X) h(Y)] = ∫∫ g(x) h(y) f(x, y) dx dy = ∫ g(x) fX(x) dx · ∫ h(y) fY(y) dy
= E[g(X)] · E[h(Y)].
Taking g(X) = X and h(Y) = Y gives E(XY) = E(X)E(Y). Hence:
If X and Y are independent, then Cov(X, Y) = 0.
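For sample data, r is computed from sample means and standard deviations; a minimal sketch on a small made-up dataset (the numbers are hypothetical, for illustration only):

```python
# Sample correlation coefficient r for a small made-up dataset.
from math import sqrt

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.0, 5.0, 4.0, 5.0]

n = len(xs)
mx = sum(xs) / n                    # sample mean of x
my = sum(ys) / n                    # sample mean of y

cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
sx  = sqrt(sum((x - mx) ** 2 for x in xs) / n)
sy  = sqrt(sum((y - my) ** 2 for y in ys) / n)

r = cov / (sx * sy)                 # r = Cov(x, y) / (sx * sy)
print(round(r, 4))  # 0.7746
```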
Example:
1. Suppose (X, Y) are discrete random variables with probability function given by
Y\X      −1     0      1      PY(y)
−1       1/8    1/8    1/8    3/8
0        1/8    0      1/8    2/8
1        1/8    1/8    1/8    3/8
PX(x)    3/8    2/8    3/8    1
E(X) = ∑x x PX(x) = (−1)(3/8) + (0)(2/8) + (1)(3/8) = 0
E(Y) = ∑y y PY(y) = (−1)(3/8) + (0)(2/8) + (1)(3/8) = 0
E(XY) = ∑x ∑y x y p(x, y) = (−1)(−1)(1/8) + (−1)(1)(1/8) + (1)(−1)(1/8) + (1)(1)(1/8) = 0
Cov(X, Y) = E(XY) − E(X)E(Y) = 0 − (0)(0) = 0
Therefore X and Y are uncorrelated. They are not, however, independent: p(0, 0) = 0 while PX(0) PY(0) = (2/8)(2/8) = 1/16 ≠ 0, so zero covariance does not imply independence.
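The computations above can be checked in a few lines; the code also exhibits the cell where the independence factorization fails:

```python
# Joint PMF from the example: x, y in {-1, 0, 1}; every cell has
# mass 1/8 except the centre cell (0, 0), which has mass 0.
p = {(x, y): (0.0 if (x, y) == (0, 0) else 1 / 8)
     for x in (-1, 0, 1) for y in (-1, 0, 1)}

ex  = sum(x * pr for (x, y), pr in p.items())      # E(X)  = 0
ey  = sum(y * pr for (x, y), pr in p.items())      # E(Y)  = 0
exy = sum(x * y * pr for (x, y), pr in p.items())  # E(XY) = 0
cov = exy - ex * ey                                # Cov(X, Y) = 0

# Independence would require p(x, y) = pX(x) * pY(y) for every cell;
# it fails at (0, 0): p(0, 0) = 0 but pX(0) * pY(0) = (2/8)(2/8).
px0 = sum(pr for (x, y), pr in p.items() if x == 0)
py0 = sum(pr for (x, y), pr in p.items() if y == 0)

print(cov, p[(0, 0)], px0 * py0)  # 0.0 0.0 0.0625
```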