
JOINT DISTRIBUTION

DEFINITION
Two-dimensional Discrete and
Continuous Random Variables:
♦Discrete: A two-dimensional discrete random variable
(X, Y) takes on a countable number of pairs of values in a
two-dimensional space. Each pair (x, y) has an associated
probability P(X = x, Y = y).
♦Continuous: A two-dimensional continuous random
variable (X, Y) takes values in a continuous range in a
two-dimensional space. Its behavior is described by a
joint probability density function f(x, y), where the
probability of (X, Y) falling within a region A is given by
the double integral of f(x, y) over A.
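A minimal numerical sketch of the continuous case, assuming a hypothetical joint density f(x, y) = 6xy² on the unit square (the density and the region are illustrative choices, not taken from the slides); the probability of a rectangular region is the double integral of f over that region, here evaluated with SciPy:

from scipy.integrate import dblquad

# Hypothetical joint density on [0, 1] x [0, 1]: f(x, y) = 6 * x * y**2
# dblquad integrates func(y, x), so the inner variable y is listed first.
def f(y, x):
    return 6.0 * x * y ** 2

# Integral over the whole square should be 1 for a valid density
total, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)

# P(0 <= X <= 0.5, 0 <= Y <= 0.5): double integral of f over that region
prob, _ = dblquad(f, 0, 0.5, lambda x: 0, lambda x: 0.5)
print(total, prob)   # approximately 1.0 and 0.03125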
Joint Probability Mass Function
(PMF) and Joint Probability
Density Function (PDF)
♦Joint PMF: For discrete random variables X and Y, the
joint PMF, denoted as P(X = x, Y = y), gives the
probability that X takes the value x and Y takes the value
y simultaneously.
♦Joint PDF: For continuous random variables X and Y, the
joint PDF, denoted as f(x, y), is a function such that the
probability of (X, Y) falling within a region R is given by
the double integral of f(x, y) over R.
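In the discrete case a joint PMF can be stored as a table. A small sketch in Python/NumPy with made-up probabilities (the values are assumptions for illustration only):

import numpy as np

# Hypothetical joint PMF of (X, Y): rows are x in {0, 1, 2}, columns are y in {0, 1}
pmf = np.array([[0.10, 0.20],
                [0.25, 0.15],
                [0.20, 0.10]])

assert np.isclose(pmf.sum(), 1.0)   # a valid joint PMF sums to 1
print(pmf[1, 0])                    # P(X = 1, Y = 0) = 0.25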
Marginal Probability Function
♦The marginal probability function of X, denoted as
P_X(x) or f_X(x), gives the probability distribution of X
alone, regardless of the value of Y.
♦For discrete variables, P_X(x) = Σ_y P(X = x, Y = y).
♦For continuous variables, f_X(x) = ∫ f(x, y) dy.
♦Similarly, the marginal probability function of Y, P_Y(y)
or f_Y(y), is defined by summing or integrating over x.
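Continuing the hypothetical PMF table from the sketch above, the marginals follow by summing out the other variable:

import numpy as np

pmf = np.array([[0.10, 0.20], [0.25, 0.15], [0.20, 0.10]])  # hypothetical joint PMF

p_x = pmf.sum(axis=1)   # P_X(x) = sum over y of P(X = x, Y = y)
p_y = pmf.sum(axis=0)   # P_Y(y) = sum over x of P(X = x, Y = y)
print(p_x, p_y)         # [0.30 0.40 0.30] and [0.55 0.45]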
Conditional Probability Function
♦The conditional probability function of X given Y = y,
denoted as P(X = x | Y = y) or f(x | y), gives the
probability distribution of X when Y is known to be y.
♦For discrete variables, P(X = x | Y = y) = P(X = x, Y = y)
/ P(Y = y).
♦For continuous variables, f(x | y) = f(x, y) / f_Y(y).
♦Similarly, the conditional probability function of Y given
X = x, P(Y = y | X = x) or f(y | x), is defined.
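A sketch of the discrete conditional distribution, again using the hypothetical table: each column of the joint PMF is divided by the corresponding marginal probability of Y.

import numpy as np

pmf = np.array([[0.10, 0.20], [0.25, 0.15], [0.20, 0.10]])  # hypothetical joint PMF
p_y = pmf.sum(axis=0)                                        # marginal of Y

cond_x_given_y0 = pmf[:, 0] / p_y[0]   # P(X = x | Y = 0) = P(X = x, Y = 0) / P(Y = 0)
print(cond_x_given_y0)                 # [0.1818... 0.4545... 0.3636...], sums to 1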
Independent Random Variables
♦Random variables X and Y are independent if the
occurrence of one does not affect the probability
distribution of the other.
♦Mathematically, X and Y are independent if and only if:
•P(X = x, Y = y) = P(X = x) * P(Y = y) for
discrete variables.
•f(x, y) = f_X(x) * f_Y(y) for continuous
variables.
•Or equivalently, P(X = x | Y = y) = P(X = x)
and P(Y = y | X = x) = P(Y = y).
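With the same hypothetical table, independence can be checked by comparing the joint PMF against the outer product of its marginals:

import numpy as np

pmf = np.array([[0.10, 0.20], [0.25, 0.15], [0.20, 0.10]])  # hypothetical joint PMF
p_x, p_y = pmf.sum(axis=1), pmf.sum(axis=0)

# X and Y are independent iff P(X = x, Y = y) = P_X(x) * P_Y(y) for every pair (x, y)
print(np.allclose(pmf, np.outer(p_x, p_y)))   # False for this table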
Correlation Coefficient of X and
Y
♦The correlation coefficient, denoted as ρ or r, measures the
strength and direction of the linear relationship between two
random variables X and Y.
♦It is calculated as ρ = Cov(X, Y) / (SD(X) * SD(Y)), where
Cov(X, Y) is the covariance between X and Y, and SD(X) and
SD(Y) are the standard deviations of X and Y, respectively.
♦The value of ρ ranges from -1 to +1, where:
•ρ = +1 indicates a perfect positive linear relationship.
•ρ = -1 indicates a perfect negative linear relationship.
•ρ = 0 indicates no linear relationship.
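A sketch computing ρ directly from the hypothetical joint PMF, assuming X takes the values 0, 1, 2 and Y takes the values 0, 1 (support values chosen for illustration):

import numpy as np

pmf = np.array([[0.10, 0.20], [0.25, 0.15], [0.20, 0.10]])  # hypothetical joint PMF
x, y = np.array([0, 1, 2]), np.array([0, 1])                 # assumed support values
p_x, p_y = pmf.sum(axis=1), pmf.sum(axis=0)

ex, ey = (x * p_x).sum(), (y * p_y).sum()      # E[X], E[Y]
exy = (np.outer(x, y) * pmf).sum()             # E[XY]
cov = exy - ex * ey                            # Cov(X, Y) = E[XY] - E[X]E[Y]
sd_x = np.sqrt(((x - ex) ** 2 * p_x).sum())    # SD(X)
sd_y = np.sqrt(((y - ey) ** 2 * p_y).sum())    # SD(Y)
rho = cov / (sd_x * sd_y)
print(rho)                                     # a value between -1 and +1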
THANK YOU!!!
