
Conditional Probability

[email protected]

Conditional probability

Let $\Omega = \Omega_1 \times \Omega_2$ be a product space on which a probability $P$, which we
write as $P_{1,2}$, is defined, and let $P_1$ and $P_2$ be the respective marginal probabilities.
We call probability on $\Omega_1$ conditioned by $\Omega_2$ a function $P_{1/2} : \Omega_2 \to \mathcal{P}(\Omega_1)$ that
associates to each $\omega_2 \in \Omega_2$ a probability $P_{1/2}(\omega_2)$ on $\Omega_1$, in such a way that, for every
$P_{1,2}$-integrable function $f : \Omega \to \mathbb{R}$,

$$\int f(\omega_1, \omega_2)\, dP_{1,2} = \int dP_2 \int f(\omega_1, \omega_2)\, dP_{1/2}(\omega_2).$$

For this, by continuity, it is enough that, for every $A \subset \Omega_1$ and $B \subset \Omega_2$:

$$\int_{A \times B} dP_{1,2} = \int_B dP_2 \int_A dP_{1/2}(\omega_2).$$

The conditional probability, when it exists, is unique for almost all $\omega_2$ on
each $A \subset \Omega_1$. In what follows we will say that two conditional probabilities are equal if
they are equal for almost all $\omega_2$ on each $A \subset \Omega_1$. When $\Omega_1 = \mathbb{R}^n$, this conditional
probability always exists (J.L. Doob, 1953).

In particular, when it exists, we have:

$$\int f(\omega_1)\, dP_1 = \int dP_2 \int f(\omega_1)\, dP_{1/2}(\omega_2).$$
We say that $\Omega_1$ and $\Omega_2$ are independent if $P_{1,2}(A \times B) = P_1(A) \cdot P_2(B)$.
In that case $P_{1/2}(\omega_2) = P_1$, and we write $P_{1,2}$ as $P_1 \times P_2$.
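As a quick illustration of the two displayed identities, here is a minimal numerical sketch on a finite product space (the two-by-three table of weights is made up for the example); it checks the disintegration formula for an arbitrary $f$ and tests the independence criterion.

import numpy as np

# Hypothetical finite example: Omega_1 = {0, 1}, Omega_2 = {0, 1, 2};
# joint[i, j] = P_{1,2}({(i, j)}), with made-up weights.
joint = np.array([[0.10, 0.25, 0.15],
                  [0.20, 0.05, 0.25]])

P1 = joint.sum(axis=1)            # marginal probability on Omega_1
P2 = joint.sum(axis=0)            # marginal probability on Omega_2
P1_2 = joint / P2                 # column j is the conditional P_{1/2}(omega_2 = j)

# Disintegration: int f dP_{1,2} = int dP_2 int f dP_{1/2}(omega_2).
f = np.array([[1.0, -2.0, 0.5],
              [3.0,  0.0, 1.5]])  # an arbitrary bounded function f(omega_1, omega_2)
lhs = (f * joint).sum()
rhs = sum(P2[j] * (f[:, j] * P1_2[:, j]).sum() for j in range(3))
assert np.isclose(lhs, rhs)

# Independence would mean joint == outer(P1, P2); it fails for these weights.
print(np.allclose(joint, np.outer(P1, P2)))   # False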

Relation between $P_{1/(2,3)}$ and $P_{1/2/3}$

In the case of the product of three spaces $\Omega = \Omega_1 \times \Omega_2 \times \Omega_3$ one can work
with different marginal and conditional probabilities, according to whether we regard $\Omega$ as
$\Omega_1 \times (\Omega_2 \times \Omega_3)$ or as $(\Omega_1 \times \Omega_2) \times \Omega_3$. It is of interest, in that case, to see the relation
among them. For example, $P_3$ can be understood as the marginal of the marginal, in
the first case, or as the simple marginal, in the second. Likewise, $P_{2/3}$ can be understood
as the conditional of the marginal, in the first case, or as the marginal of the
conditional, in the second. There is no real ambiguity in the previous cases, because
these probabilities agree in both readings:

$$\int_C dP_3 \int_B dP_{2/3}(\omega_3) = \int_C dP_3 \int_{\Omega_1 \times B} dP_{(1,2)/3}(\omega_3) = P_{2,3}(B \times C)$$

In the equalities that follow, the previous probabilities appear in both senses,
and they allow us to infer the equality, for almost all $(\omega_2, \omega_3)$, of $P_{1/(2,3)}$ and
$P_{1/2/3}$, in case the latter exists.

P ( A  B  C )   dP2 3   dP1 /  2 3  2 , 3    dP   dP /  3    dP / 


3 2 3 1 2 3  2 , 3 
BC A C B A

P ( A  B  C )   dP3   dP( 1 2 )/ 3 3    dP3   dP2 / 3 3    dP1 / 2 / 3 2 , 3 


C A B C B A

In general, if $P$ is a probability on $\Omega_3$, $Q(\omega_3)$ is a probability on $\Omega_2$ for each $\omega_3$,
and $R(\omega_2, \omega_3)$ is a probability on $\Omega_1$ for each $(\omega_2, \omega_3)$, such that

$$P(A \times B \times C) = \int_C dP \int_B dQ(\omega_3) \int_A dR(\omega_2, \omega_3),$$

then $P = P_3$, $Q(\omega_3) = P_{2/3}(\omega_3)$ and $R(\omega_2, \omega_3) = P_{1/2/3}(\omega_2, \omega_3) = P_{1/(2,3)}(\omega_2, \omega_3)$.
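The statement can be made concrete on a finite three-factor space. The following sketch (with made-up weights on a $2 \times 3 \times 2$ table) evaluates both sides of the triple factorisation $P(A \times B \times C) = \int_C dP_3 \int_B dP_{2/3} \int_A dP_{1/(2,3)}$ for particular sets $A$, $B$, $C$.

import numpy as np

rng = np.random.default_rng(0)
# Hypothetical joint P_{1,2,3} on a finite 2 x 3 x 2 product space.
joint = rng.random((2, 3, 2))
joint /= joint.sum()

P3 = joint.sum(axis=(0, 1))     # marginal on Omega_3
P23 = joint.sum(axis=0)         # marginal on Omega_2 x Omega_3
P2_3 = P23 / P3                 # P_{2/3}(omega_3), columns indexed by omega_3
P1_23 = joint / P23             # P_{1/(2,3)}(omega_2, omega_3)

# Sets A, B, C given as index lists.
A, B, C = [0], [0, 2], [1]

# Left-hand side: P(A x B x C) read directly from the joint table.
lhs = joint[np.ix_(A, B, C)].sum()

# Right-hand side: the nested integral over C, B and A.
rhs = sum(P3[k] * sum(P2_3[j, k] * sum(P1_23[i, j, k] for i in A) for j in B)
          for k in C)
assert np.isclose(lhs, rhs)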

Probability P1 /  2 .

Let P be the probability defined on     and 1 , 2   . Let us consider the


space product   1    2  with the probability P1 2 defined by:
P1 2  A  B   P  A  B  . The marginal probabilities P agree here with P . We i

call, in this case, to the corresponding conditional probability “Probability on   1 


conditioned by   2  ” and we represent it by P1 / 2 .

P1 2 is concentrated on the diagonal as shown by the following formula:


 f ( ,  ) dP
1 2 1  2   f ( ,  ) dP ( 1 2 )

In the case $\mathcal{A}_1 = \mathcal{A}$ and $\mathcal{A}_2 = \mathcal{A}_B$ (the $\sigma$-algebra generated by $B$), we write $P_{/B}$.
$P_{/B}(A)$ must be of the form $\alpha\, I_B + \beta\, I_{B^c}$, because these are the only
$\mathcal{A}_B$-measurable functions; applying the defining relation to the set $B$, we therefore have

$$P(A \cap B) = \int_B (\alpha\, I_B + \beta\, I_{B^c})\, dP = \alpha \cdot P(B),$$

whence $\alpha = \dfrac{P(A \cap B)}{P(B)}$ and, in the same way, $\beta = \dfrac{P(A \cap B^c)}{P(B^c)}$;
therefore

$$P_{/B}(A) = I_B\, \frac{P(A \cap B)}{P(B)} + I_{B^c}\, \frac{P(A \cap B^c)}{P(B^c)}.$$
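As a small worked instance (the space and the sets are chosen here only for illustration): take $\Omega = \{1, \dots, 6\}$ with the uniform probability, $B = \{2, 4, 6\}$ and $A = \{2, 4\}$. Then

$$P_{/B}(A) = I_B\, \frac{P(\{2,4\})}{P(B)} + I_{B^c}\, \frac{P(\emptyset)}{P(B^c)} = \frac{1/3}{1/2}\, I_B = \frac{2}{3}\, I_B,$$

which on $B$ recovers the elementary conditional probability $P(A \mid B) = 2/3$ and vanishes on $B^c$.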

Also, it is curious that:

* $P_{(\mathcal{A}_1,\mathcal{A}_2)/\mathcal{A}_3}(\omega)(A \times B) = P_{/\mathcal{A}_3}(\omega)(A \cap B)$
* $P_{/(B,C)}(\omega) = P_{/B/C}(\omega, \omega)$. And, in general: $P_{/(\mathcal{A}_1,\mathcal{A}_2)}(\omega) = P_{/\mathcal{A}_1/\mathcal{A}_2}(\omega, \omega)$
* $P_{/B/B}(\omega_1, \omega_2) = P_{/B}(\omega_2)$. And, in general: $P_{/\mathcal{A}_1/\mathcal{A}_1}(\omega_1, \omega_2) = P_{/\mathcal{A}_1}(\omega_2)$

Conditioned by $\mathcal{A}_1$, $\mathcal{A}$ is independent of $\mathcal{A}_1$.

We also say that $\mathcal{A}_1$ is independent of $(\mathcal{A}_2, \mathcal{A}_3, \dots, \mathcal{A}_n)$ if $(\Omega, \mathcal{A}_1)$ is
independent of $(\Omega, \mathcal{A}_2) \times (\Omega, \mathcal{A}_3) \times \dots \times (\Omega, \mathcal{A}_n)$, that is, $P_{1,(2,\dots,n)} = P_1 \times P_{2,\dots,n}$. And we
will say that they are mutually independent if $P_{1,2,\dots,n} = P_1 \times P_2 \times \dots \times P_n$. And we
will say the same about families of sets if their $\sigma$-algebras are independent in the corresponding sense.

Relation between $P_{1,2,\dots,n}$ and $P$.


If $A \subset \Omega^n$, we call $\Delta(A) = \{\omega : (\omega, \omega, \dots, \omega) \in A\}$; it is easy to see that
$\Delta(A_1 \times A_2 \times \dots \times A_n) = A_1 \cap A_2 \cap \dots \cap A_n$ and that $P_{1,2,\dots,n}(A) = P(\Delta(A))$.

And so, $\mathcal{A}_1$ is independent of $(\mathcal{A}_2, \mathcal{A}_3)$ if, and only if, it is independent of $\mathcal{A}_{(2,3)}$
(the $\sigma$-algebra generated by $\mathcal{A}_2$ and $\mathcal{A}_3$); we can prove it in two different ways:

 P1    2 3   A    B    P  A    B    P   A  B    P
  1  2 3  A B 
 P  A  P
  B   P  A   P   B  
 2 3  

 P1  (  2 3 )  A  ( B  C )   P    A  B  C   P  A  P
1 2 3 2 3 (B  C ) 
 P  A  P  B  C 

Probability of X conditioned by Y
If $X, Y$ are random variables defined on the same space $(\Omega, \mathcal{A})$ with probability
$P$, these variables determine on $\mathbb{R}$ (or on spaces $\Omega_1$ and $\Omega_2$, in general) individual
probabilities $P_X$ and $P_Y$ respectively. As before, we can define on $\mathbb{R}^2$ (or on $\Omega_1 \times \Omega_2$) a
joint probability given by:

$$P_{X,Y}(A \times B) = P(X \in A,\, Y \in B) = P\big((X, Y) \in A \times B\big),$$

where the marginal probabilities agree with $P_X$ and $P_Y$. We call the corresponding
conditional probability "probability of $X$ conditioned by $Y$" and we represent it by $P_{X/Y}$.

Here, as in the previous case, we have:

$$\int f(x, y)\, dP_{X,Y} = \int f\big(X(\omega), Y(\omega)\big)\, dP(\omega)$$

Let us calculate, as an example, $P_{X/X^2}$, $f$ being the density function of $X$:

$$P_{X,X^2}(A \times B) = P_X\big(A \cap \sqrt{B}\big) + P_X\big(A \cap (-\sqrt{B})\big) = \int_{A \cap \sqrt{B}} f(x)\, dx + \int_{A \cap (-\sqrt{B})} f(x)\, dx$$

$$= \int_B \left[\frac{f(\sqrt{y})}{f(\sqrt{y}) + f(-\sqrt{y})}\, I_A(\sqrt{y}) + \frac{f(-\sqrt{y})}{f(\sqrt{y}) + f(-\sqrt{y})}\, I_A(-\sqrt{y})\right] \big(f(\sqrt{y}) + f(-\sqrt{y})\big)\, d\sqrt{y}$$

$$= \int_B \left[\frac{f(\sqrt{y})}{f(\sqrt{y}) + f(-\sqrt{y})}\, I_A(\sqrt{y}) + \frac{f(-\sqrt{y})}{f(\sqrt{y}) + f(-\sqrt{y})}\, I_A(-\sqrt{y})\right] dP_{X^2}(y).$$

It must therefore be:

$$P_{X/X^2}(y)(A) = \frac{f(\sqrt{y})\, I_A(\sqrt{y}) + f(-\sqrt{y})\, I_A(-\sqrt{y})}{f(\sqrt{y}) + f(-\sqrt{y})}$$

And, conversely: $P_{X^2/X}(x)(B) = I_B(x^2)$.
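A Monte Carlo sketch of this formula, under an arbitrarily chosen asymmetric density (a two-component Gaussian mixture): among samples whose square falls in a thin bin around a fixed $y_0$, the fraction with $X > 0$ should approach $f(\sqrt{y_0})/\big(f(\sqrt{y_0}) + f(-\sqrt{y_0})\big)$.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical asymmetric density: 0.7 N(1, 1) + 0.3 N(-1, 1).
def npdf(x, mu):
    return np.exp(-(x - mu) ** 2 / 2) / np.sqrt(2 * np.pi)

def f(x):
    return 0.7 * npdf(x, 1.0) + 0.3 * npdf(x, -1.0)

# Sample X from the mixture.
n = 2_000_000
comp = rng.random(n) < 0.7
x = np.where(comp, rng.normal(1.0, 1.0, n), rng.normal(-1.0, 1.0, n))
y = x ** 2

# Empirical P(X > 0 | X^2 ~ y0), estimated from a thin bin around y0.
y0, eps = 1.5, 0.01
in_bin = np.abs(y - y0) < eps
empirical = np.mean(x[in_bin] > 0)

# Value given by the formula, with A = (0, +inf).
r = np.sqrt(y0)
theoretical = f(r) / (f(r) + f(-r))
print(empirical, theoretical)   # the two values should be close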

In the case of three variables $X, Y, Z$ we have, in the same way as before,
$P_{X/(Y,Z)} = P_{X/Y/Z}$. And if $Z$ is independent of $(X, Y)$, then $P_{X/Y/Z} = P_{X/Y}$, because:

$$\int_{A \times B \times C} dP_{X,Y,Z} = \int_C dP_Z \cdot \int_{A \times B} dP_{X,Y} = \int_C dP_Z \int_B dP_Y \int_A dP_{X/Y}$$

Also, if $Z = Y$, we can obtain the formula for a two-dimensional variable
conditioned by one of its coordinates, $P_{(X,Y)/Y}$:

$$P\big((X,Y) \in A \times B,\, Y \in C\big) = P\big(X \in A,\, Y \in B \cap C\big) = \int_{B \cap C} dP_Y \int_A dP_{X/Y} = \int_C dP_Y \left(I_B \int_A dP_{X/Y}\right),$$

whence:

$$P_{(X,Y)/Y}(y)(A \times B) = I_B(y) \cdot P_{X/Y}(y)(A)$$

In this last expression we can see that $X/Y$ is independent of $Y/Y$, and so:

$$P_{X/Y/Y} = P_{X/Y}$$

Here we have the relation $P_{X_1, X_2, \dots, X_n}(A) = P\big((X_1, X_2, \dots, X_n) \in A\big)$. And if
$f : \mathbb{R}^2 \to \mathbb{R}$ is a measurable function and $X$ is independent of $(X_1, X_2)$, then $X$ is also
independent of $Y = f(X_1, X_2)$:

$$P_{X,Y}(A \times B) = P(X \in A,\, Y \in B) = P\big(X \in A,\, (X_1, X_2) \in f^{-1}(B)\big) = P\big((X, X_1, X_2) \in A \times f^{-1}(B)\big)$$
$$= P_{X, X_1, X_2}\big(A \times f^{-1}(B)\big) = P_X(A) \cdot P_{X_1, X_2}\big(f^{-1}(B)\big) = P(X \in A) \cdot P\big((X_1, X_2) \in f^{-1}(B)\big)$$
$$= P(X \in A) \cdot P(Y \in B) = P_X(A) \cdot P_Y(B)$$
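A quick Monte Carlo sketch of this fact, with arbitrary choices ($X$, $X_1$, $X_2$ i.i.d. standard normal, $f(a, b) = a \cdot b$, and two fixed intervals $A$, $B$): both sides of the product rule are estimated from the same sample.

import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# X independent of (X1, X2); all standard normal here (arbitrary choice).
x = rng.standard_normal(n)
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
y = x1 * x2                      # Y = f(X1, X2) with f(a, b) = a * b

# Events A = (0, 1) and B = (-0.5, 0.5), chosen arbitrarily for the check.
in_A = (0.0 < x) & (x < 1.0)
in_B = (-0.5 < y) & (y < 0.5)

joint = np.mean(in_A & in_B)              # estimate of P(X in A, Y in B)
product = np.mean(in_A) * np.mean(in_B)   # estimate of P(X in A) * P(Y in B)
print(joint, product)                     # the two estimates should be close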

We can also deduce two well-known conditional expectation formulas, and see
that the conditional expectation, from the point of view treated here, coincides with the
classical one whenever it exists.

* $E\big[X \cdot I_B(Y)\big] = E\big[E[X \mid Y] \cdot I_B(Y)\big]$, because:

$$\int I_B(y)\, x\, dP_{X,Y} = \int I_B(y)\, dP_Y \int x\, dP_{X/Y}$$

* $E[X \mid Z] = E\big[E[X \mid Y, Z] \mid Z\big]$, because:

$$\int x\, dP_{X/Z} = \int dP_{Y/Z} \int x\, dP_{X/(Y,Z)} = \int dP_{(Y,Z)/Z} \int x\, dP_{X/(Y,Z)}$$
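A minimal Monte Carlo sketch of the first identity, under an arbitrary model in which $E[X \mid Y]$ is known in closed form ($X = Y + \varepsilon$ with $Y$, $\varepsilon$ independent standard normals, so $E[X \mid Y] = Y$).

import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Arbitrary model: X = Y + eps with Y, eps independent standard normals,
# so that E[X | Y] = Y exactly.
y = rng.standard_normal(n)
eps = rng.standard_normal(n)
x = y + eps

# Take B = [0, inf): compare E[X * I_B(Y)] with E[E[X|Y] * I_B(Y)] = E[Y * I_B(Y)].
i_b = (y >= 0.0).astype(float)
lhs = np.mean(x * i_b)
rhs = np.mean(y * i_b)
print(lhs, rhs)   # both approximate E[Y * I_{Y>=0}] = 1/sqrt(2*pi) ~ 0.399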

Casuistry of the conditional probability

Certainly it is possible to speak of many other types of conditional probability
and, without trying to make an exhaustive analysis of all of them, it is clear that they all
arise in a natural way from the first definition of $P_{1/2}$. Thus, for example, if as
before $X$ and $Y$ are variables defined on a space $\Omega$ with probability $P$, we can
construct, apart from the previous ones, different product spaces: $\mathbb{R}_X \times (\Omega, \mathcal{A}_Y)$,
$(\Omega, \mathcal{A}) \times \mathbb{R}_Y$, etc. On these product spaces we can define probabilities in a natural way,
from the probability $P$, together with the respective conditional probabilities: $P_{X/\mathcal{A}_Y}$, which to
each $\omega \in \Omega$ associates a probability on $\mathbb{R}$; $P_{/Y}$, which to each real number associates
a probability on $\Omega$; etc. These probabilities are related in an intuitive way, thus:

* $P_{X/Y}(A) = P_{/Y}(X \in A)$ and, in the same way, $P_{X/\mathcal{A}_Y}(A) = P_{/\mathcal{A}_Y}(X \in A)$, since:

$$P(X \in A,\, Y \in B) = \int_B dP_Y \int_A dP_{X/Y} = \int_B dP_Y \int_{\{X \in A\}} dP_{/Y}.$$

* PX / Y   PX /Y Y and, on similar way, P / Y   P /Y Y , since:


P  X  A  Y  B    dPY   dPX /Y   dP (Y )   d  PX /Y Y    dP (Y )   dPX / Y  .
B A Y B A Y B A

A last problem

Let $X$ be a random variable, let $\mathcal{A}^+$ be the $\sigma$-algebra generated by the intervals
contained in the positive reals $(x > 0)$, and let $\mathcal{A}^-$ be the $\sigma$-algebra generated by the
intervals contained in the negative reals $(x < 0)$. Determine $P_{X/\mathcal{A}^+/\mathcal{A}^-}(y, z)(A)$.
