
Bayesian Networks:
Probability Theory and Probabilistic Reasoning

Javier Larrosa

Computing and Intelligent Systems


Master en Enginyeria Informàtica
Spring 2013, Barcelona
Departament de Llenguatges i Sistemes Informàtics


UNIVERSITAT POLITÈCNICA DE CATALUNYA
UPC



What is a probabilistic model?

A set Z of mutually exclusive outcomes (Z is usually defined as the
Cartesian product of a set of random variables)
A probability measure P : Z −→ R that satisfies the following
axioms:
1. 0 ≤ P(z) ≤ 1 for every z ∈ Z
2. ∑_{z∈Z} P(z) = 1
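
A minimal Python sketch of this definition (illustrative only, not part of the original slides; the two binary variables and their probabilities are assumptions):

from itertools import product

# Outcome space Z as the Cartesian product of two assumed binary variables
Z = list(product([True, False], repeat=2))
# Probability measure P : Z -> R, given here as an illustrative table
P = {(True, True): 0.2, (True, False): 0.3,
     (False, True): 0.1, (False, False): 0.4}

# Axiom 1: 0 <= P(z) <= 1 for every z in Z
assert all(0.0 <= P[z] <= 1.0 for z in Z)
# Axiom 2: the probabilities of all outcomes sum to 1
assert abs(sum(P[z] for z in Z) - 1.0) < 1e-9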



Events

An event is a set of outcomes


We will consider events described in propositional logic
A proposition a specifies the set of possible worlds a ⊆ Z in
which the proposition holds
The semantics is straightforward:
a ∨ b = Union of outcomes from a and b
a ∧ b = Intersection of outcomes from a and b
¬a = Set of outcomes not in a
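
A small follow-up sketch (illustrative, reusing Z from the previous sketch): events as Python sets of outcomes, with the three connectives as set operations.

# Events over the assumed outcome space Z = {(a, b)} from the previous sketch
a = {z for z in Z if z[0]}    # proposition a: outcomes where the first variable holds
b = {z for z in Z if z[1]}    # proposition b: outcomes where the second variable holds

a_or_b  = a | b               # a ∨ b : union of outcomes
a_and_b = a & b               # a ∧ b : intersection of outcomes
not_a   = set(Z) - a          # ¬a    : outcomes not in a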



Refresh your memory

Prior probability: P(a) = ∑_{z∈a} P(z)


Inclusion-exclusion principle: P(a ∨ b) = P(a) + P(b) − P(a ∧ b)
Notation: We will write P(a, b) as shorthand for P(a ∧ b)
Conditional or posterior probability: P(a|b) refers to the degree of
belief in a given b:

P(a|b) = P(a, b) / P(b)

Product rule: P(a, b) = P(a|b)P(b)
Chain rule:
P(a1, a2, . . . , an) = P(a1|a2, . . . , an) P(a2|a3, . . . , an) · · · P(an)
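
A sketch of these rules on the running example (illustrative; prob and cond are helper names introduced here, reusing P, a and b from the earlier sketches):

def prob(event):
    """Prior probability: P(a) = sum of P(z) over the outcomes z in the event."""
    return sum(P[z] for z in event)

def cond(event_a, event_b):
    """Conditional probability: P(a|b) = P(a, b) / P(b)."""
    return prob(event_a & event_b) / prob(event_b)

# Inclusion-exclusion: P(a ∨ b) = P(a) + P(b) − P(a ∧ b)
assert abs(prob(a | b) - (prob(a) + prob(b) - prob(a & b))) < 1e-9
# Product rule: P(a, b) = P(a|b) P(b)
assert abs(prob(a & b) - cond(a, b) * prob(b)) < 1e-9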



Refresh your memory

Bayes rule: P(a|b) = P(b|a)P(a) / P(b)
Law of total probability: Let b1 , b2 , . . . , bn be a set of mutually
exclusive and exhaustive events. Then,
P(a) = ∑_{i=1}^{n} P(a, bi)

Equivalently,
P(a) = ∑_{i=1}^{n} P(a|bi)P(bi)
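
Continuing the same sketch (illustrative), Bayes rule and the law of total probability can be checked numerically with the partition {b, ¬b}:

not_b = set(Z) - b   # ¬b, so that b and ¬b are mutually exclusive and exhaustive

# Bayes rule: P(a|b) = P(b|a) P(a) / P(b)
assert abs(cond(a, b) - cond(b, a) * prob(a) / prob(b)) < 1e-9
# Law of total probability: P(a) = P(a|b) P(b) + P(a|¬b) P(¬b)
assert abs(prob(a) - (cond(a, b) * prob(b) + cond(a, not_b) * prob(not_b))) < 1e-9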



Refresh your memory

a and b are independent (denoted a⊥b) if P(a|b) = P(a)


Equivalently, P(a, b) = P(a)P(b)
a and b are independent given c (a⊥b|c) if P(a|b, c) = P(a|c)
Equivalently, P(a, b|c) = P(a|c)P(b|c)
Warning: a⊥b|c does not imply a⊥b
Warning: a⊥b does not imply a⊥b|c
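
The two notions of independence as small numerical tests (illustrative helper names, built on prob and cond from the earlier sketches):

def independent(ev_a, ev_b, tol=1e-9):
    """a ⊥ b  iff  P(a, b) = P(a) P(b)."""
    return abs(prob(ev_a & ev_b) - prob(ev_a) * prob(ev_b)) < tol

def cond_independent(ev_a, ev_b, ev_c, tol=1e-9):
    """a ⊥ b | c  iff  P(a, b | c) = P(a|c) P(b|c)."""
    return abs(cond(ev_a & ev_b, ev_c) - cond(ev_a, ev_c) * cond(ev_b, ev_c)) < tol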



Variable independence

Variables A and B are independent (denoted A⊥B) if P(a|b) = P(a)
for all possible instantiations a, b of A, B.
Variables A and B are independent given C (denoted A⊥B|C) if
P(a|b, c) = P(a|c) for all possible instantiations a, b, c of A, B, C.
Observation: Variable independence is stronger than event
independence
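
A sketch of why the variable-level notion is stronger (illustrative; values and event_of are assumed helpers, not from the slides): it demands the event-level test for every pair of instantiations.

# values[V] is the assumed domain of variable V; event_of(V, v) is the assumed
# helper returning the event {z ∈ Z : z assigns value v to V}
def variables_independent(A, B, values, event_of):
    return all(independent(event_of(A, va), event_of(B, vb))
               for va in values[A] for vb in values[B])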



Diagnosis

Example: Diagnosis of Cancer (C ∈ {c, ¬c}, T ∈ {t, ¬t})


P(c) = .01, P(t|c) = .9, P(t|¬c) = .2,
P(¬c) = .99, P(¬t|c) = .1, P(¬t|¬c) = .8

P(C, T) = P(T|C)P(C)

P(C|T) = P(T|C)P(C) / P(T)
P(c|t) = .04
P(c|¬t) = .001
P(t) = .207
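
A short Python sketch (illustrative) reproducing these numbers with the total-probability and Bayes rules:

P_c, P_t_c, P_t_nc = 0.01, 0.9, 0.2              # P(c), P(t|c), P(t|¬c)

P_t    = P_t_c * P_c + P_t_nc * (1 - P_c)        # P(t)    = .207
P_c_t  = P_t_c * P_c / P_t                       # P(c|t)  ≈ .0435
P_c_nt = (1 - P_t_c) * P_c / (1 - P_t)           # P(c|¬t) ≈ .0013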



Diagnosis with two independent tests

Example: Diagnosis of Cancer with two tests


(C ∈ {c, ¬c}, Ti ∈ {ti , ¬ti }).
We assume T1 and T2 are conditionally independent given C (T1⊥T2|C)
P(c) = .01, P(ti |c) = .9, P(ti |¬c) = .2,
P(¬c) = .99, P(¬ti |c) = .1, P(¬ti |¬c) = .8,

P(C, T1 , T2 ) = P(T1 |C, T2 )P(T2 |C )P(C ) = P(T1 |C )P(T2 |C )P(C )


P(c|t1 ,t2 ) = .16
P(c|t1 ) = .04
P(t1 ) = .207
P(t1 |t2 ) = .23
Note that T1 and T2 are NOT independent
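
A sketch (illustrative) of the two-test computation; the factorization P(T1|C)P(T2|C)P(C) is what the conditional independence assumption buys:

P_c, P_ti_c, P_ti_nc = 0.01, 0.9, 0.2                 # P(c), P(ti|c), P(ti|¬c)

P_t1     = P_ti_c * P_c + P_ti_nc * (1 - P_c)         # P(t1)      = .207
P_t1t2   = P_ti_c**2 * P_c + P_ti_nc**2 * (1 - P_c)   # P(t1, t2)  = .0477
P_c_t1   = P_ti_c * P_c / P_t1                        # P(c|t1)    ≈ .0435
P_c_t1t2 = P_ti_c**2 * P_c / P_t1t2                   # P(c|t1,t2) ≈ .1698
P_t1_given_t2 = P_t1t2 / P_t1   # P(t1|t2) = P(t1,t2)/P(t2) ≈ .2304 ≠ P(t1)
                                # (P(t2) = P(t1) by symmetry)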



Conditional Independence Example

Example: Three-day weather


Si ∈ {si, ¬si} (the weather on day i, 1 ≤ i ≤ 3, may be sunny or not)
We assume S1 ⊥S3 |S2
P(s1 ) = .9, P(si |si−1 ) = .95, P(si |¬si−1 ) = .6
P(¬s1) = .1, P(¬si|si−1) = .05, P(¬si|¬si−1) = .4

P(S1 , S2 , S3 ) = P(S3 |S1 , S2 )P(S2 |S1 )P(S1 ) = P(S3 |S2 )P(S2 |S1 )P(S1 )
P(s1 ) = .90
P(s2 ) = .91
P(s3 ) = .92
P(s3 |s1 ) = .93
P(s1 |s3 ) = .91
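
A sketch (illustrative) of these numbers, chaining the one-step conditionals through the S1⊥S3|S2 assumption:

P_s1, P_s_s, P_s_ns = 0.9, 0.95, 0.6              # P(s1), P(si|si−1), P(si|¬si−1)

P_s2    = P_s_s * P_s1 + P_s_ns * (1 - P_s1)      # P(s2)    ≈ .915
P_s3    = P_s_s * P_s2 + P_s_ns * (1 - P_s2)      # P(s3)    ≈ .920
P_s3_s1 = P_s_s * P_s_s + P_s_ns * (1 - P_s_s)    # P(s3|s1) ≈ .9325
P_s1_s3 = P_s3_s1 * P_s1 / P_s3                   # P(s1|s3) ≈ .912 (Bayes rule)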



Explaining Away

Example: I am happy (H) when it is sunny (S) or when I get a salary raise (R).
We assume S⊥R
P(s) = .7, P(r ) = .01, P(h|s, r ) = 1, P(h|¬s, r ) = .9,
P(h|s, ¬r ) = .7, P(h|¬s, ¬r ) = .1

P(H, S, R) = P(H|S, R)P(S, R) = P(H|S, R)P(S)P(R)


P(r|h) = .018
P(r|h, s) = .014
Note that R and S are NOT independent given H
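
A sketch (illustrative) that reproduces the explaining-away numbers by enumerating the joint P(H, S, R) = P(H|S, R)P(S)P(R):

P_s, P_r = 0.7, 0.01
P_h_given = {(True, True): 1.0, (False, True): 0.9,     # P(h|s,r),  P(h|¬s,r)
             (True, False): 0.7, (False, False): 0.1}   # P(h|s,¬r), P(h|¬s,¬r)

def joint(h, s, r):
    """P(H=h, S=s, R=r) = P(h|s,r) P(s) P(r), using S ⊥ R."""
    p_h = P_h_given[(s, r)] if h else 1 - P_h_given[(s, r)]
    return p_h * (P_s if s else 1 - P_s) * (P_r if r else 1 - P_r)

P_h    = sum(joint(True, s, r) for s in (True, False) for r in (True, False))
P_r_h  = sum(joint(True, s, True) for s in (True, False)) / P_h            # ≈ .018
P_r_hs = joint(True, True, True) / sum(joint(True, True, r)
                                       for r in (True, False))             # ≈ .014
# Observing s lowers P(r|h): the sunny day "explains away" the happiness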

