
1.5 Lecture 3

1.5.1 Law of Total Probability and Bayes Formula

Given events A and B, recall that the probability of B given A is

P(B|A) = \frac{P(A \cap B)}{P(A)},

and that A and B are independent if P(A \cap B) = P(A)P(B).
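As a quick illustration (a hypothetical example, not from the original notes): roll a fair die and let A = \{2, 4, 6\} (an even outcome) and B = \{1, 2\}. Then

P(B|A) = \frac{P(A \cap B)}{P(A)} = \frac{P(\{2\})}{P(\{2,4,6\})} = \frac{1/6}{1/2} = \frac{1}{3} = P(B),

and indeed P(A \cap B) = \frac{1}{6} = P(A)P(B), so these particular events A and B are independent.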


Lemma 1.21. 1. The multiplicative law: given events A and B, then

P(A \cap B) = P(A|B)P(B),

and similarly, given events A_1, A_2, A_3, then

P(A_1 \cap A_2 \cap A_3) = P(A_3 | A_2 \cap A_1) P(A_2 | A_1) P(A_1).

2. The additive law: let A and B be events; then

P(A \cup B) = P(A) + P(B) - P(A \cap B).

Proof. 1. To prove the multiplicative law for events A_1, A_2 and A_3, simply apply the
definition of conditional probability consecutively and write

P(A_1 \cap A_2 \cap A_3) = P(A_3 | A_1 \cap A_2) P(A_1 \cap A_2)

                         = P(A_3 | A_1 \cap A_2) P(A_2 | A_1) P(A_1).

2. To prove the additive law for two events, we write A \cup B = A \cup (A^c \cap B) and B = (A \cap B) \cup (A^c \cap B),
which are both unions of disjoint sets. Then from Lemma 1.12.1 we have

P(A \cup B) = P(A) + P(A^c \cap B)

P(B) = P(A \cap B) + P(A^c \cap B), that is, P(A^c \cap B) = P(B) - P(A \cap B),

and by substituting the second equation into the first, we have P(A \cup B) = P(A) + P(B) - P(A \cap B).
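As a quick numerical check of the additive law (a hypothetical example, not from the original notes): for one roll of a fair die, let A = \{2, 4, 6\} and B = \{1, 2, 3\}. Then A \cup B = \{1, 2, 3, 4, 6\} and

P(A \cup B) = \frac{5}{6} = \frac{1}{2} + \frac{1}{2} - \frac{1}{6} = P(A) + P(B) - P(A \cap B).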
Remark 1.22. Recall that when using a tree diagram one multiplies probabilities down the branches. The right-hand side of the multiplicative law is exactly what you are computing when doing that.
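For instance (an illustrative example, not from the original notes): draw two cards from a standard 52-card deck without replacement and let A_1 = \{first card is an ace\} and A_2 = \{second card is an ace\}. Multiplying down the branches of the tree,

P(A_1 \cap A_2) = P(A_2 | A_1) P(A_1) = \frac{3}{51} \cdot \frac{4}{52} = \frac{1}{221}.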
Lemma 1.23. (Law of Total Probability) Suppose (A_i)_{i=1,...,k} are mutually exclusive and exhaustive
of \Omega, that is \bigcup_{i=1}^{k} A_i = \Omega; then for any event B, we have

P(B) = \sum_{i=1}^{k} P(B|A_i) P(A_i).

Proof. It is easy to see that B = B \cap \Omega, and by using the fact that (A_i)_{i=1,...,k} is exhaustive of \Omega, we
can write

B = B \cap \Omega

  = B \cap \bigcup_{i=1}^{k} A_i

  = \bigcup_{i=1}^{k} (B \cap A_i)    (by the distributive law).

Then by noticing that the sets (B \cap A_i)_{i=1,...,k} are again disjoint and using the definition of conditional
probability, we have

P(B) = \sum_{i=1}^{k} P(B \cap A_i) = \sum_{i=1}^{k} P(B|A_i) P(A_i),

and the result is proven.
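As a hypothetical illustration of the law of total probability (not from the original notes): an urn is chosen uniformly at random from two urns, where urn 1 contains 3 red balls and 1 blue ball, and urn 2 contains 1 red ball and 3 blue balls. Let A_i = \{urn i is chosen\} and B = \{a red ball is drawn\}. Then

P(B) = P(B|A_1)P(A_1) + P(B|A_2)P(A_2) = \frac{3}{4} \cdot \frac{1}{2} + \frac{1}{4} \cdot \frac{1}{2} = \frac{1}{2}.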


Lemma 1.24. (Bayes Formula) Given events A, B and a family of disjoint and exhaustive events
(A_i)_{i=1,...,k}, then

P(A|B) = \frac{P(B|A)P(A)}{\sum_{i=1}^{k} P(B|A_i) P(A_i)}.

Proof. From the definition of conditional probability and the multiplicative law,

P(A|B) = \frac{P(A \cap B)}{P(B)} = \frac{P(B|A)P(A)}{P(B)},

then by applying the law of total probability to P(B) in the denominator, we have

P(A|B) = \frac{P(B|A)P(A)}{\sum_{i=1}^{k} P(B|A_i) P(A_i)},

and this gives us the formula.


Example 1.25. (Applications of Bayes Formula) A diagnostic test for a certain disease claims to
be 90% accurate in the following sense.

If the patient has the disease, then the test will show positive with probability 0.9.
If the patient does not have the disease, then the test will show negative with probability 0.9.

Also we know that 1% of the population has the disease.


Let A = {person has the disease} and B = {person tests positive}; we are interested in
computing the probability P(A|B). Intuitively, P(A|B) is of interest because for a patient to
decide whether or not to have the appropriate treatment, he/she must decide whether or not
he/she is really sick given that the test was positive.
Notice that A and A^c are mutually exclusive and exhaustive, since one can only be sick or not. Then by Lemma 1.24
(Bayes formula) we have

P(A|B) = \frac{P(B|A)P(A)}{P(B|A)P(A) + P(B|A^c)P(A^c)}.

One only needs to compute the right-hand side.
From the problem, we know that P(B|A) = 0.9, P(A) = 0.01, P(A^c) = 0.99 and P(B^c|A^c) = 0.9.
To compute P(B|A^c), we only need to notice that

P(B^c|A^c) + P(B|A^c) = 1 (which can be proven from the definition of conditional probability),

so P(B|A^c) = 1 - 0.9 = 0.1.

By putting the numbers together we see that

P(A|B) = \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.1 \times 0.99} = \frac{0.009}{0.108} = \frac{1}{12} !!!
Moral of the story: Never trust what your doctor tells you!
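For readers who want to check such calculations numerically, here is a minimal Python sketch (not part of the original notes; the function posterior and its argument names are my own):

# Minimal numerical check of Example 1.25 (illustrative sketch, not from the notes).

def posterior(p_b_given_a, p_a, p_b_given_ac):
    """Return P(A|B) via Bayes' formula with the two-event partition {A, A^c}."""
    p_ac = 1.0 - p_a                               # P(A^c) = 1 - P(A)
    numerator = p_b_given_a * p_a                  # P(B|A) P(A)
    denominator = numerator + p_b_given_ac * p_ac  # law of total probability for P(B)
    return numerator / denominator

# Numbers from Example 1.25: 90% accurate test, 1% prevalence.
print(posterior(p_b_given_a=0.9, p_a=0.01, p_b_given_ac=0.1))  # 0.0833... = 1/12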

