Conditional Distributions and Stochastic Independence
Conditional probability is the probability of one thing being true given that another thing is true.
It is a probability assigned to an event after receiving some information about other relevant events. We
discuss here how to update the probability distribution of a random variable X after observing the
realization of another random variable Y, i.e., after receiving the information that another random
variable Y has taken a specific value y. The updated probability distribution of X will be called the
conditional probability distribution of X given Y=y.
If 𝑋 and 𝑌 are discrete random variables with joint PMF given by 𝑝(𝑥, 𝑦), then the conditional
probability mass function of 𝑋, given that 𝑌 = 𝑦, is denoted 𝑝𝑋|𝑌 (𝑥|𝑦) and given by:

𝑝𝑋|𝑌 (𝑥|𝑦) = 𝑝(𝑥, 𝑦) / 𝑝𝑌 (𝑦)

Similarly, the conditional PMF of 𝑌, given that 𝑋 = 𝑥, is 𝑝𝑌|𝑋 (𝑦|𝑥) = 𝑝(𝑥, 𝑦) / 𝑝𝑋 (𝑥). Note that:
• if 𝑝𝑌 (𝑦) = 0, then for that value of 𝑌 the conditional PMF of 𝑋 does not exist.
• if 𝑝𝑋 (𝑥) = 0, then for that value of 𝑋 the conditional PMF of 𝑌 does not exist.
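The discrete definition above can be sketched numerically. The joint PMF below is a hypothetical example chosen only for illustration (its entries sum to 1); the helper names are likewise assumptions, not standard library functions:

```python
# Hypothetical joint PMF p(x, y) for X in {0, 1} and Y in {0, 1}.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.30,
    (1, 0): 0.20, (1, 1): 0.40,
}

def marginal_y(y):
    """Marginal PMF p_Y(y): sum p(x, y) over all x."""
    return sum(p for (x, yy), p in joint_pmf.items() if yy == y)

def conditional_x_given_y(x, y):
    """Conditional PMF p_{X|Y}(x|y) = p(x, y) / p_Y(y)."""
    py = marginal_y(y)
    if py == 0:
        # Matches the bullet above: the conditional PMF does not exist here.
        raise ValueError("p_Y(y) = 0, conditional PMF of X does not exist")
    return joint_pmf.get((x, y), 0.0) / py

# Conditioning on Y = 1: p_Y(1) = 0.30 + 0.40 = 0.70
print(conditional_x_given_y(0, 1))  # 0.30 / 0.70 ≈ 0.4286
print(conditional_x_given_y(1, 1))  # 0.40 / 0.70 ≈ 0.5714
```

As a sanity check, the conditional probabilities for a fixed 𝑦 sum to 1, since they are the joint probabilities renormalized by 𝑝𝑌 (𝑦).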
If 𝑋 and 𝑌 are continuous random variables with joint pdf given by 𝑓(𝑥, 𝑦), then the conditional
probability density function (pdf) of 𝑋, given that 𝑌 = 𝑦, is denoted 𝑓𝑋|𝑌 (𝑥|𝑦) and given by:
𝑓𝑋|𝑌 (𝑥|𝑦) = 𝑓(𝑥, 𝑦) / 𝑓𝑌 (𝑦)
Similarly, the conditional probability density function (pdf) of 𝑌, given that 𝑋 = 𝑥, is denoted
𝑓𝑌|𝑋 (𝑦|𝑥) and given by:
𝑓𝑌|𝑋 (𝑦|𝑥) = 𝑓(𝑥, 𝑦) / 𝑓𝑋 (𝑥)
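A minimal continuous sketch, assuming the hypothetical joint pdf 𝑓(𝑥, 𝑦) = 𝑥 + 𝑦 on the unit square (chosen for illustration because it integrates to 1 and its marginal has the closed form 𝑓𝑌 (𝑦) = 𝑦 + 1/2):

```python
# Hypothetical joint pdf f(x, y) = x + y on [0, 1] x [0, 1].
def joint_pdf(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_y(y):
    # f_Y(y) = integral of (x + y) dx over [0, 1] = y + 1/2, in closed form.
    return y + 0.5 if 0 <= y <= 1 else 0.0

def conditional_x_given_y(x, y):
    # f_{X|Y}(x|y) = f(x, y) / f_Y(y)
    return joint_pdf(x, y) / marginal_y(y)

# A valid conditional pdf integrates to 1 in x; check with a midpoint Riemann sum.
n = 100_000
total = sum(conditional_x_given_y((i + 0.5) / n, 0.25) / n for i in range(n))
print(round(total, 6))  # ≈ 1.0
```

The Riemann-sum check mirrors the defining property of a conditional density: dividing the joint pdf by the marginal 𝑓𝑌 (𝑦) renormalizes each slice 𝑦 = const into a proper density in 𝑥.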