
9th Material Subject: Joint Probability Distribution (Continuous)
Undergraduate of Telecommunication Engineering
MUH1F3 - PROBABILITY AND STATISTICS

Telkom University
Center of eLearning & Open Education Telkom University
Jl. Telekomunikasi No.1, Bandung - Indonesia
http://www.telkomuniversity.ac.id

Lecturer: Nor Kumalasari Caecar Pratiwi, S.T., M.T. ([email protected])


TABLE OF CONTENTS:

1. Joint Probability Density Functions


2. Marginal Probability Density Functions
3. Conditional Probability Distribution
4. Covariance and Correlation

LEARNING OBJECTIVES:

After careful study of this chapter, students should be able to do the following:
1. Use joint probability density functions to calculate probabilities
2. Calculate marginal and conditional probability distributions from joint probability distributions
3. Interpret and calculate covariance and correlations between random variables
JOINT PROBABILITY DENSITY FUNCTION

For simplicity, we begin with random experiments that involve only two random variables; this is called the bivariate case. The joint probability density function of the continuous random variables X and Y, denoted f_XY(x, y), satisfies:

f_XY(x, y) ≥ 0   (1)

∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1   (2)

For any region R of the plane, P((X, Y) ∈ R) = ∬_R f_XY(x, y) dx dy   (3)
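As a quick illustration, the two defining properties can be checked numerically. The following is a minimal sketch, assuming SciPy is available and using the joint PDF f_XY(x, y) = (4/81)·x·y from the worked example later in this material:

```python
# Minimal sketch: checking properties (1)-(2) numerically with SciPy,
# using the example joint PDF f_XY(x, y) = (4/81)*x*y on 0 < x, y < 3.
from scipy.integrate import dblquad

def f_xy(x, y):
    # Joint PDF; zero outside the square (0, 3) x (0, 3).
    return (4 / 81) * x * y if (0 < x < 3 and 0 < y < 3) else 0.0

# Property (2): the joint PDF integrates to 1 over the whole plane.
# dblquad integrates func(y, x) (inner variable first), hence the swap.
total, _ = dblquad(lambda y, x: f_xy(x, y), 0, 3, 0, 3)
print(total)  # ~1.0

# Probabilities of rectangular regions are double integrals of the PDF,
# e.g. P(0 < X < 1, 0 < Y < 1):
p, _ = dblquad(lambda y, x: f_xy(x, y), 0, 1, 0, 1)
print(p)      # ~1/81 ≈ 0.0123
```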



MARGINAL PROBABILITY DENSITY FUNCTION

The marginal probability density functions of the continuous random variables X and Y, denoted f_X(x) and f_Y(y), are obtained by integrating out the other variable:

f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy   (4)

f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx   (5)
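A short numerical sketch of equations (4) and (5), again assuming SciPy and the example density f_XY(x, y) = (4/81)·x·y on 0 < x, y < 3:

```python
# Sketch: the marginal of X is obtained by integrating the joint PDF over y.
from scipy.integrate import quad

def f_xy(x, y):
    return (4 / 81) * x * y if (0 < x < 3 and 0 < y < 3) else 0.0

def f_x(x):
    val, _ = quad(lambda y: f_xy(x, y), 0, 3)  # integrate out y
    return val

print(f_x(1.5))  # ~0.333, matching the closed form (2/9)*x at x = 1.5
```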



JOINT CUMULATIVE DISTRIBUTION FUNCTION

Recall that, for a single random variable X, the CDF is defined as F_X(x) = P(X ≤ x). If we have two random variables X and Y and would like to study them jointly, we define the joint cumulative distribution function as follows:

F_XY(x, y) = P(X ≤ x and Y ≤ y) = P({X ≤ x} ∩ {Y ≤ y})   (6)
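Equation (6) can be evaluated as a double integral of the joint PDF over (−∞, x] × (−∞, y]. A sketch with the example density, whose support starts at 0 (SciPy assumed):

```python
# Sketch: joint CDF F_XY(x, y) as the integral of the joint PDF up to (x, y).
from scipy.integrate import dblquad

def f_xy(x, y):
    return (4 / 81) * x * y if (0 < x < 3 and 0 < y < 3) else 0.0

def F_xy(x, y):
    # Integrate f_XY(u, v) for 0 < u <= x and 0 < v <= y.
    val, _ = dblquad(lambda v, u: f_xy(u, v), 0, x, 0, y)
    return val

print(F_xy(3.0, 3.0))  # ~1.0, consistent with property (2)
print(F_xy(1.5, 2.0))  # ~0.111 = (1.5**2 / 9) * (2**2 / 9)
```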



INDEPENDENT BIVARIATE

The random variables X and Y are independent if and only if:

f_XY(x, y) = f_X(x) · f_Y(y)   for all x and y   (7)

or, equivalently:

F_XY(x, y) = P(X ≤ x) · P(Y ≤ y) = F_X(x) · F_Y(y)   (8)
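For the example density used later in this material, the joint PDF factorises as f_X(x) · f_Y(y), so X and Y are in fact independent. A small grid check, assuming NumPy:

```python
# Sketch: verifying equation (7) on a grid of points inside the support.
import numpy as np

f_xy = lambda x, y: (4 / 81) * x * y   # joint PDF on (0, 3) x (0, 3)
f_x  = lambda x: (2 / 9) * x           # marginal of X (derived in the example)
f_y  = lambda y: (2 / 9) * y           # marginal of Y

xs = np.linspace(0.1, 2.9, 15)
X, Y = np.meshgrid(xs, xs)
print(np.allclose(f_xy(X, Y), f_x(X) * f_y(Y)))  # True -> independent
```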



COVARIANCE AND CORRELATION

When two random variables X and Y are not independent, it is frequently of interest to assess how strongly they are related to one another. The covariance between two random variables X and Y is:

Cov(X, Y) = E(XY) − E(X) · E(Y)   (9)

where the joint expectation is:

E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x · y · f_XY(x, y) dx dy   (10)

The correlation coefficient of X and Y is:

Cor(X, Y) = ρ_XY = Cov(X, Y) / (σ_X · σ_Y)   (11)
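A numerical sketch of equations (9)-(11) with the example density (SciPy assumed); because that density factorises, the covariance and correlation both come out near zero:

```python
# Sketch: covariance and correlation by numerical integration.
from scipy.integrate import dblquad, quad

f_xy = lambda x, y: (4 / 81) * x * y   # joint PDF on (0, 3) x (0, 3)
f_x  = lambda x: (2 / 9) * x           # marginal PDFs from the example
f_y  = lambda y: (2 / 9) * y

E_XY, _ = dblquad(lambda y, x: x * y * f_xy(x, y), 0, 3, 0, 3)  # eq. (10)
E_X, _  = quad(lambda x: x * f_x(x), 0, 3)
E_Y, _  = quad(lambda y: y * f_y(y), 0, 3)
E_X2, _ = quad(lambda x: x**2 * f_x(x), 0, 3)
E_Y2, _ = quad(lambda y: y**2 * f_y(y), 0, 3)

cov = E_XY - E_X * E_Y                                     # eq. (9)
rho = cov / ((E_X2 - E_X**2)**0.5 * (E_Y2 - E_Y**2)**0.5)  # eq. (11)
print(cov, rho)  # both ~0, as expected for independent X and Y
```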



EXAMPLE

Example: Suppose that X and Y are two continuous random variables with joint PDF:

f_XY(x, y) = c · x · y   for 0 < x < 3 and 0 < y < 3
f_XY(x, y) = 0           otherwise

a. Determine the value of c
b. Determine the marginal PDF of X
c. Determine the marginal PDF of Y
d. Determine P(1 < X < 2)
e. Determine P(X ≥ 1)
f. Determine P(Y ≤ 2.5)



Answer:

a. The value of c must satisfy the normalization requirement ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1:

∫_0^3 ∫_0^3 c·x·y dx dy = 1
∫_0^3 [c·x²·y / 2]_{x=0}^{3} dy = 1
∫_0^3 (9·c·y / 2) dy = 1
[9·c·y² / 4]_{y=0}^{3} = 1

81c / 4 = 1  →  c = 4/81
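The value of c can also be cross-checked symbolically; a sketch assuming SymPy is installed:

```python
# Symbolic cross-check of part (a).
import sympy as sp

x, y, c = sp.symbols('x y c', positive=True)
total = sp.integrate(c * x * y, (x, 0, 3), (y, 0, 3))  # = 81*c/4
print(sp.solve(sp.Eq(total, 1), c))                    # [4/81]
```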
b. The marginal PDF of X:

f_X(x) = ∫_0^3 (4/81)·x·y dy = [(2/81)·x·y²]_{y=0}^{3} = (2/9)·x

So, the marginal PDF of X is:

f_X(x) = (2/9)·x   for 0 < x < 3
f_X(x) = 0         otherwise
c. The marginal PDF of Y:

f_Y(y) = ∫_0^3 (4/81)·x·y dx = [(2/81)·x²·y]_{x=0}^{3} = (2/9)·y

So, the marginal PDF of Y is:

f_Y(y) = (2/9)·y   for 0 < y < 3
f_Y(y) = 0         otherwise
d. P(1 < X < 2):

P(1 < X < 2) = ∫_1^2 (2/9)·x dx = [(1/9)·x²]_1^2 = 4/9 − 1/9 = 3/9 = 1/3

e. P(X ≥ 1):

P(X ≥ 1) = P(1 < X < 3) = ∫_1^3 (2/9)·x dx = [(1/9)·x²]_1^3 = 9/9 − 1/9 = 8/9



f. P(Y ≤ 2.5):

P(Y ≤ 2.5) = P(0 < Y < 2.5) = ∫_0^{2.5} (2/9)·y dy = [(1/9)·y²]_0^{2.5} = 6.25/9 ≈ 0.694
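Parts (d)-(f) can be cross-checked numerically with the marginal PDFs derived above (SciPy assumed):

```python
# Numerical cross-check of parts (d)-(f).
from scipy.integrate import quad

f_x = lambda x: (2 / 9) * x   # marginal of X, 0 < x < 3
f_y = lambda y: (2 / 9) * y   # marginal of Y, 0 < y < 3

print(quad(f_x, 1, 2)[0])     # ~0.3333 = 3/9     (part d)
print(quad(f_x, 1, 3)[0])     # ~0.8889 = 8/9     (part e)
print(quad(f_y, 0, 2.5)[0])   # ~0.6944 = 6.25/9  (part f)
```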



Thank You

