Conditional Probability Distribution
In probability theory and statistics, given two jointly distributed random variables X and Y, the
conditional probability distribution of Y given X is the probability distribution of Y when X is known
to be a particular value; in some cases the conditional probabilities may be expressed as functions
containing the unspecified value x of X as a parameter. When both X and Y are categorical variables, a
conditional probability table is typically used to represent the conditional probability. The conditional
distribution contrasts with the marginal distribution of a random variable, which is its distribution without
reference to the value of the other variable.
If the conditional distribution of Y given X is a continuous distribution, then its probability density function
is known as the conditional density function.[1] The properties of a conditional distribution, such as the
moments, are often referred to by corresponding names such as the conditional mean and conditional
variance.
More generally, one can refer to the conditional distribution of a subset of a set of more than two variables;
this conditional distribution is contingent on the values of all the remaining variables, and if more than one
variable is included in the subset then this conditional distribution is the conditional joint distribution of the
included variables.
Conditional discrete distributions
For discrete random variables, the conditional probability mass function of Y given X = x can be written,
according to its definition, as

    p_{Y|X}(y | x) = P(Y = y | X = x) = P(X = x, Y = y) / P(X = x).

Due to the occurrence of P(X = x) in the denominator, this is defined only for non-zero (hence strictly
positive) P(X = x).
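To make the definition concrete, the following sketch computes a conditional pmf from a joint pmf stored as a small table; the joint probabilities and the function name are illustrative choices, not taken from this article.

    # Minimal sketch, assuming an illustrative joint pmf for X, Y in {0, 1}.
    joint = {
        (0, 0): 0.1, (0, 1): 0.4,
        (1, 0): 0.3, (1, 1): 0.2,
    }

    def conditional_pmf_y_given_x(joint, x):
        """Return {y: P(Y=y | X=x)}; defined only when P(X=x) > 0."""
        p_x = sum(p for (xi, _), p in joint.items() if xi == x)  # marginal P(X=x)
        if p_x == 0:
            raise ValueError("P(X=x) = 0: the conditional pmf is undefined")
        return {y: p / p_x for (xi, y), p in joint.items() if xi == x}

    print(conditional_pmf_y_given_x(joint, 0))  # {0: 0.2, 1: 0.8}

Note how the zero-denominator case is rejected, mirroring the restriction to P(X = x) > 0 above.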
Example
Consider the roll of a fair die and let X = 1 if the number is even (i.e., 2, 4, or 6) and X = 0 otherwise.
Furthermore, let Y = 1 if the number is prime (i.e., 2, 3, or 5) and Y = 0 otherwise.
D 1 2 3 4 5 6
X 0 1 0 1 0 1
Y 0 1 1 0 1 0
Then the unconditional probability that X = 1 is 3/6 = 1/2 (since there are six possible rolls of the die, of
which three are even), whereas the probability that X = 1 conditional on Y = 1 is 1/3 (since there are
three possible prime number rolls, namely 2, 3, and 5, of which one is even).
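The example can be checked by direct enumeration, as in the sketch below; the encoding of the two indicator variables is ours, but the numbers are those of the example.

    # Sketch: verify P(X=1) = 1/2 and P(X=1 | Y=1) = 1/3 for the fair-die example.
    die = range(1, 7)
    X = {d: int(d % 2 == 0) for d in die}      # 1 if the roll is even
    Y = {d: int(d in (2, 3, 5)) for d in die}  # 1 if the roll is prime

    p_x1 = sum(X[d] for d in die) / 6                      # unconditional P(X=1)
    p_x1_given_y1 = (sum(X[d] * Y[d] for d in die)
                     / sum(Y[d] for d in die))             # P(X=1 | Y=1)
    print(p_x1, p_x1_given_y1)  # 0.5 0.3333...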
Conditional continuous distributions
Similarly for continuous random variables, the conditional probability density function of Y given the
occurrence of the value x of X can be written as[2]

    f_{Y|X}(y | x) = f_{X,Y}(x, y) / f_X(x),

where f_{X,Y}(x, y) gives the joint density of X and Y, while f_X(x) gives the marginal density for X. Also
in this case it is necessary that f_X(x) > 0.
The concept of the conditional distribution of a continuous random variable is not as intuitive as it might
seem: Borel's paradox shows that conditional probability density functions need not be invariant under
coordinate transformations.
Example
For jointly normally distributed random variables X and Y, the conditional distribution of Y given X = x is
again normal: the slice of the joint density along the line X = x, rescaled to have unit area, is the
conditional density of Y.
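A minimal numerical sketch of this slicing, assuming illustrative parameter values (the means, standard deviations, and correlation below are not from this article): dividing the joint density by the marginal density of X reproduces the known closed form for the conditional normal density.

    import math

    # Illustrative bivariate normal parameters (assumed for this sketch).
    mu_x, mu_y, s_x, s_y, rho = 65.0, 70.0, 3.0, 4.0, 0.6

    def normal_pdf(z, mu, sigma):
        return math.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    def joint_pdf(x, y):
        """Bivariate normal density with the parameters above."""
        zx, zy = (x - mu_x) / s_x, (y - mu_y) / s_y
        q = (zx ** 2 - 2 * rho * zx * zy + zy ** 2) / (1 - rho ** 2)
        return math.exp(-q / 2) / (2 * math.pi * s_x * s_y * math.sqrt(1 - rho ** 2))

    x, y = 70.0, 72.0
    slice_density = joint_pdf(x, y) / normal_pdf(x, mu_x, s_x)  # f_{Y|X}(y | x)
    closed_form = normal_pdf(y, mu_y + rho * (s_y / s_x) * (x - mu_x),
                             s_y * math.sqrt(1 - rho ** 2))
    print(slice_density, closed_form)  # the two values agree

The closed form used for comparison, Y | X = x ~ N(mu_Y + rho (sigma_Y / sigma_X)(x - mu_X), (1 - rho^2) sigma_Y^2), is the standard conditional distribution of a bivariate normal.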
Relation to independence
Random variables X, Y are independent if and only if the conditional distribution of Y given X is, for all
possible realizations of X, equal to the unconditional distribution of Y. For discrete random variables this
means P(Y = y | X = x) = P(Y = y) for all possible x and y with P(X = x) > 0. For continuous
random variables X and Y, having a joint density function, it means f_{Y|X}(y | x) = f_Y(y) for all
possible x and y with f_X(x) > 0.
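On the die example above this criterion fails, so X and Y there are dependent. The sketch below checks the equivalent product form P(X = x, Y = y) = P(X = x) P(Y = y) with exact arithmetic; the encoding is an illustrative choice.

    from fractions import Fraction
    from itertools import product

    die = range(1, 7)
    X = {d: int(d % 2 == 0) for d in die}      # even indicator
    Y = {d: int(d in (2, 3, 5)) for d in die}  # prime indicator

    def p(pred):
        """Exact probability of a predicate under a fair die roll."""
        return Fraction(sum(1 for d in die if pred(d)), 6)

    independent = all(
        p(lambda d: X[d] == x and Y[d] == y) == p(lambda d: X[d] == x) * p(lambda d: Y[d] == y)
        for x, y in product((0, 1), repeat=2)
    )
    print(independent)  # False: P(X=1, Y=1) = 1/6, but P(X=1) P(Y=1) = 1/4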
Properties
Seen as a function of y for given x, P(Y = y | X = x) is a probability mass function and so the sum over
all y (or integral if it is a conditional probability density) is 1. Seen as a function of x for given y, it is a
likelihood function, so that the sum over all x need not be 1.
Additionally, a marginal of a joint distribution can be expressed as the expectation of the corresponding
conditional distribution. For instance, p_X(x) = E_Y[ p_{X|Y}(x | Y) ].
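This identity can be verified on a small discrete joint distribution, as in the sketch below; the table values are illustrative, not from this article.

    # Sketch: check p_X(x) = E_Y[ p_{X|Y}(x | Y) ] = sum_y p_Y(y) p_{X|Y}(x | y).
    joint = {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.3, (1, 1): 0.2}  # p(x, y)

    p_y = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (0, 1)}

    def p_x_given_y(x, y):
        return joint[(x, y)] / p_y[y]

    for x in (0, 1):
        marginal = sum(p for (xi, _), p in joint.items() if xi == x)
        via_conditional = sum(p_y[y] * p_x_given_y(x, y) for y in (0, 1))
        print(x, marginal, via_conditional)  # the last two columns match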
Measure-theoretic formulation
Let (Ω, 𝓕, P) be a probability space, and let 𝒢 be a σ-field in 𝓕. Given A ∈ 𝓕, the Radon–Nikodym theorem
implies that there is[3] a 𝒢-measurable random variable P(A | 𝒢) : Ω → ℝ, called the conditional
probability, such that

    ∫_G P(A | 𝒢)(ω) dP(ω) = P(A ∩ G)

for every G ∈ 𝒢, and such a random variable is uniquely defined up to sets of probability zero. A
conditional probability is called regular if P(· | 𝒢)(ω) is a probability measure on (Ω, 𝓕) for all ω ∈ Ω
a.e.
Special cases:
For the trivial sigma algebra 𝒢 = {∅, Ω}, the conditional probability is the constant function
P(A | {∅, Ω}) = P(A).
For a real-valued random variable X (with respect to the Borel σ-field on ℝ), every conditional
probability distribution is regular.[4] In this case, E[X | 𝒢] = ∫ x μ(dx, ·) almost surely, where μ is
the conditional distribution of X given 𝒢.
An expectation of a random variable with respect to a regular conditional probability is equal to its
conditional expectation.
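On a finite probability space where 𝒢 is generated by a partition, the conditional probability P(A | 𝒢) can be written out explicitly: it is the random variable that is constant on each block B of the partition, with value P(A ∩ B) / P(B) there. The sketch below (sample space, partition, and event are illustrative choices) computes it for a fair die.

    from fractions import Fraction

    omega = {w: Fraction(1, 6) for w in range(1, 7)}  # fair die
    partition = [{1, 2, 3}, {4, 5, 6}]                # generates the sigma-field G
    A = {2, 4, 6}                                     # the event "even"

    def P(S):
        return sum(omega[w] for w in S)

    def cond_prob(A, w):
        """Value of the random variable P(A | G) at outcome w."""
        block = next(B for B in partition if w in B)
        return P(A & block) / P(block)

    print([cond_prob(A, w) for w in omega])
    # [1/3, 1/3, 1/3, 2/3, 2/3, 2/3]: constant on each block

Taking the trivial partition [{1, 2, 3, 4, 5, 6}] instead recovers the first special case: the constant function P(A) = 1/2.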
See also
Conditioning (probability)
Conditional probability
Regular conditional probability
Bayes' theorem
References
Citations
1. Ross, Sheldon M. (1993). Introduction to Probability Models (Fifth ed.). San Diego:
Academic Press. pp. 88–91. ISBN 0-12-598455-3.
2. Park, Kun Il (2018). Fundamentals of Probability and Stochastic Processes with Applications
to Communications. Springer. ISBN 978-3-319-68074-3.
3. Billingsley (1995), p. 430
4. Billingsley (1995), p. 439
Sources
Billingsley, Patrick (1995). Probability and Measure (https://books.google.com/books?id=a3gavZbxyJcC) (3rd ed.). New York, NY: John Wiley and Sons.