
Computer Science CPSC 322

Lecture 20
Bayesian Networks:
Construction

Lecture Overview

• Recap lecture 19
• Bayesian networks: construction
• Defining Conditional Probabilities in a Bnet
• Considerations on Network Structure (time
permitting)

Chain Rule
• Allows representing a Joint Probability Distribution (JPD) as the product of conditional probability distributions

Theorem (Chain Rule):

P(f1 ∧ … ∧ fn) = ∏i=1..n P(fi | fi−1 ∧ … ∧ f1)
Chain Rule example

P(f1 ∧ … ∧ fn) = ∏i=1..n P(fi | fi−1 ∧ … ∧ f1)
P(A,B,C,D)
= P(D|A,B,C) × P(A,B,C)
= P(D|A,B,C) × P(C|A,B) × P(A,B)
= P(D|A,B,C) × P(C|A,B) × P(B|A) × P(A)
= P(A) P(B|A) P(C|A,B) P(D|A,B,C)
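To see the chain rule end to end, here is a minimal Python sketch (not from the slides; all names are illustrative): it builds an arbitrary JPD over four Boolean variables and checks that the chain-rule product reproduces every entry.

```python
import itertools
import random

random.seed(0)

# An arbitrary joint distribution P(A,B,C,D) over four Boolean variables,
# stored as {(a, b, c, d): probability}.
worlds = list(itertools.product([True, False], repeat=4))
weights = [random.random() for _ in worlds]
total = sum(weights)
joint = {w: wt / total for w, wt in zip(worlds, weights)}

def marg(idx, vals):
    """P(variables at positions idx take values vals), by summing the JPD."""
    return sum(p for w, p in joint.items()
               if all(w[i] == v for i, v in zip(idx, vals)))

def cond(i, v, given_idx, given_vals):
    """P(X_i = v | the given variables take given_vals)."""
    return marg([i] + list(given_idx), [v] + list(given_vals)) / marg(given_idx, given_vals)

# Chain rule: P(a,b,c,d) = P(a) * P(b|a) * P(c|a,b) * P(d|a,b,c)
for (a, b, c, d) in worlds:
    prod = (marg([0], [a])
            * cond(1, b, [0], [a])
            * cond(2, c, [0, 1], [a, b])
            * cond(3, d, [0, 1, 2], [a, b, c]))
    assert abs(prod - joint[(a, b, c, d)]) < 1e-9

print("chain rule verified on all 16 worlds")
```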
Why does the chain rule help us?
We will see how, under specific circumstances (independence among variables), this rule helps gain compactness:

• We can represent the JPD as a product of marginal distributions
• We can simplify some terms when the variables involved are marginally independent or conditionally independent
Marginal Independence

• Intuitively: if X ╨ Y, then
• learning that Y=y does not change your belief in X
• and this is true for all values y that Y could take

• For example, weather is marginally independent of the result of a coin toss
Exploiting marginal independence
• Recall the product rule
p(X=x ˄ Y=y) = p(X=x | Y=y) × p(Y=y)
• If X and Y are marginally independent,
p(X=x | Y=y) = p(X=x)
• Thus we have
p(X=x ˄ Y=y) = p(X=x) × p(Y=y)
• In distribution form
p(X,Y) = p(X) × p(Y)

Exploiting marginal independence

Exponentially fewer than the JPD!

Given the binary variables A, B, C, D:

To specify P(A,B,C,D) one needs the JPD below (16 rows):

A B C D | P(A,B,C,D)
T T T T |
T T T F |
T T F T |
T T F F |
T F T T |
T F T F |
T F F T |
T F F F |
F T T T |
F T T F |
F T F T |
F T F F |
F F T T |
F F T F |
F F F T |
F F F F |

To specify P(A) × P(B) × P(C) × P(D) one needs only the marginals below:

A | P(A)      B | P(B)      C | P(C)      D | P(D)
T |           T |           T |           T |
F |           F |           F |           F |
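A quick illustrative sketch (assumed, not from the slides) of why this is "exponentially fewer": a full JPD over n Boolean variables needs 2^n - 1 free numbers, while n independent marginals need only n.

```python
def jpd_params(n: int) -> int:
    # 2^n rows in the joint table, minus 1 because they must sum to 1
    return 2 ** n - 1

def marginal_params(n: int) -> int:
    # one number P(X=T) per variable; P(X=F) follows as 1 - P(X=T)
    return n

for n in (4, 10, 30):
    print(n, jpd_params(n), marginal_params(n))
# n=4: 15 vs 4;  n=10: 1023 vs 10;  n=30: 1073741823 vs 30
```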
Conditional Independence

• Intuitively: if X and Y are conditionally independent given Z, then
  • learning that Y=y does not change your belief in X when we already know Z=z
  • and this is true for all values y that Y could take and all values z that Z could take
Example for Conditional Independence
• Whether light l1 is lit (Lit-l1) and the position of switch s2 (Up-s2) are not marginally independent
• The position of the switch determines whether there is power in the wire w0 connected to the light

  [Up-s2 → Lit-l1]

• However, whether light l1 is lit is conditionally independent from the position of switch s2 given whether there is power in wire w0 (Power-w0)
• Once we know Power-w0, learning values for Up-s2 does not change our beliefs about Lit-l1
• I.e., Lit-l1 is conditionally independent of Up-s2 given Power-w0

  [Up-s2 → Power-w0 → Lit-l1]
Conditional vs. Marginal Independence
Two variables can be:

Conditionally but not marginally independent
• ExamGrade and AssignmentGrade
• ExamGrade and AssignmentGrade given UnderstoodMaterial
  [Understood Material → Assignment Grade; Understood Material → Exam Grade]
• Lit-l1 and Up-s2
• Lit-l1 and Up-s2 given Power-w0
  [Up-s2 → Power-w0 → Lit-l1]

Marginally but not conditionally independent
• SmokingAtSensor and Fire
• SmokingAtSensor and Fire given Alarm
  [Smoking At Sensor → Alarm ← Fire]

Both marginally and conditionally independent
• CanucksWinStanleyCup and Lit-l1
• CanucksWinStanleyCup and Lit-l1 given Power-w0
  [Power-w0 → Lit-l1; Canucks Win is disconnected]

Neither marginally nor conditionally independent
• Temperature and Cloudiness
• Temperature and Cloudiness given Wind
  [Cloudiness, Wind and Temperature all interrelated]
Exploiting Conditional Independence
Example 2: Boolean variables A, B, C, D
• D is conditionally independent of both A and B given C
⇒ We can rewrite P(D | A,B,C) as P(D|C)
• P(D|C) is much simpler to specify than P(D | A,B,C)!
If A, B, C, D are Boolean variables, P(D | A,B,C) is given by the following table (8 rows; each row represents the probability distribution for D given the values that A, B and C take in that row):

A B C | P(D=T|A,B,C) | P(D=F|A,B,C)
T T T |              |
T T F |              |
T F T |              |
T F F |              |
F T T |              |
F T F |              |
F F T |              |
F F F |              |

P(D|C) is given by the following table (2 rows; each row represents the probability distribution for D given the value that C takes in that row):

C | P(D=T|C) | P(D=F|C)
T |          |
F |          |
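To make the 8-rows-vs-2-rows point concrete, a CPT can be stored as a mapping from parent assignments to P(D=T | parents). A minimal sketch with placeholder probabilities (illustrative, not from the slides):

```python
from itertools import product

# Full CPT P(D=T | A,B,C): one row per assignment to the 3 parents.
cpt_full = {abc: 0.5 for abc in product([True, False], repeat=3)}   # placeholder numbers

# If D is conditionally independent of A and B given C, P(D=T | C) suffices.
cpt_small = {(True,): 0.9, (False,): 0.2}                           # placeholder numbers

print(len(cpt_full), "rows vs", len(cpt_small), "rows")             # 8 rows vs 2 rows
```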
Putting It All Together
• Given the JPD P(A,B,C,D), we can apply the chain rule to get
  P(A,B,C,D) = P(A) × P(B|A) × P(C|A,B) × P(D|A,B,C)
• If D is conditionally independent of A and B given C, we can rewrite the above as
  P(A,B,C,D) = P(A) × P(B|A) × P(C|A,B) × P(D|C)

Under independence we gain compactness (fewer/smaller distributions to deal with):
• The chain rule allows us to write the JPD as a product of conditional distributions
• Conditional independence allows us to write them more compactly
Bayesian (or Belief) Networks

• Bayesian networks and their extensions are Representation & Reasoning systems explicitly defined to exploit independence in probabilistic reasoning
Bayesian Networks: Intuition

• A graphical representation for a joint probability distribution


• Nodes are random variables
• Directed edges between nodes reflect dependence
• Some informal examples:

  [Up-s2 → Power-w0 → Lit-l1]
  [Understood Material → Assignment Grade; Understood Material → Exam Grade]
  [Smoking At Sensor → Alarm ← Fire]
Belief (or Bayesian) networks
Def. A Belief network consists of
• a directed, acyclic graph (DAG), where each node is associated with a random variable Xi
• a domain for each variable Xi
• a set of conditional probability distributions for each node Xi given its parents Pa(Xi) in the graph:
  P(Xi | Pa(Xi))
• The parents Pa(Xi) of a variable Xi are the variables Xi directly depends on
• A Bayesian network is a compact representation of the JPD for a set of variables (X1, …, Xn):
  P(X1, …, Xn) = ∏i=1..n P(Xi | Pa(Xi))
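This definition maps directly onto a small data structure: a tuple of parents and a CPT per node, with the JPD obtained as the product above. A minimal illustrative sketch for Boolean variables (names assumed, not from the slides):

```python
# A Bnet over Boolean variables: for each node, its parents and its CPT.
# CPTs map a tuple of parent values to P(node = True | those values).
def joint(parents, cpt, assignment):
    """P(X1=x1, ..., Xn=xn) = product over i of P(Xi | Pa(Xi))."""
    p = 1.0
    for var, pa in parents.items():
        p_true = cpt[var][tuple(assignment[q] for q in pa)]
        p *= p_true if assignment[var] else 1.0 - p_true
    return p

# Tiny demo: a single Boolean node X with no parents and P(X=T) = 0.3.
print(joint({"X": ()}, {"X": {(): 0.3}}, {"X": False}))   # 0.7
```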
Lecture Overview

• Recap lecture 19
• Bayesian networks: construction
• Defining Conditional Probabilities in a Bnet
• Considerations on Network Structure (time
permitting)

How to build a Bayesian network
1. Define a total order over the random variables: (X1, …, Xn)
2. Apply the chain rule:
   P(X1, …, Xn) = ∏i=1..n P(Xi | X1, …, Xi−1)
   (X1, …, Xi−1 are the predecessors of Xi in the total order defined over the variables)
3. For each Xi, select the smallest set of predecessors Pa(Xi) such that Xi is conditionally independent from all its other predecessors given Pa(Xi):
   P(Xi | X1, …, Xi−1) = P(Xi | Pa(Xi))
4. Then we can rewrite
   P(X1, …, Xn) = ∏i=1..n P(Xi | Pa(Xi))
   • This is a compact representation of the initial JPD
   • It is a factorization of the JPD based on existing conditional independencies among the variables
How to build a Bayesian network (cont’d)
5. Construct the Bayesian Net (BN)
• Nodes are the random variables
• Draw a directed arc from each variable in Pa(Xi) to Xi
• Define a conditional probability table (CPT) for each
variable Xi:
• P(Xi | Pa(Xi))

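Steps 1-5 can be phrased as a small algorithm. The sketch below is illustrative, not from the slides: it assumes a hypothetical cond_indep(X, others, given) oracle that answers conditional-independence questions for the domain (in practice this knowledge comes from a domain expert or from data).

```python
from itertools import combinations

def build_structure(order, cond_indep):
    """Follow steps 1-3: for each Xi (in the total order), pick the smallest
    set of predecessors Pa(Xi) that makes Xi conditionally independent of
    its remaining predecessors given Pa(Xi)."""
    parents = {}
    for i, x in enumerate(order):
        preds = order[:i]
        for size in range(len(preds) + 1):          # smallest sets first
            candidates = [pa for pa in combinations(preds, size)
                          if cond_indep(x, [p for p in preds if p not in pa], pa)]
            if candidates:
                parents[x] = candidates[0]
                break
    return parents

# Toy usage: here we fake an oracle from known parent sets, purely to show
# the interface (a real oracle would encode domain knowledge).
TRUE_PARENTS = {"F": (), "T": (), "A": ("F", "T"), "S": ("F",), "L": ("A",), "R": ("L",)}

def oracle(x, others, given):
    return set(TRUE_PARENTS[x]) <= set(given)

print(build_structure(["F", "T", "A", "S", "L", "R"], oracle))
```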
Example for BN construction: Fire Diagnosis
You want to diagnose whether there is a fire in a building
• You can receive reports (possibly noisy) about whether everyone is
leaving the building
• If everyone is leaving, this may have been caused by a fire alarm
• If there is a fire alarm, it may have been caused by a fire or by
tampering
• If there is a fire, there may be smoke
Start by choosing the random variables for this domain, here all are Boolean:
• Tampering (T) is true when the alarm has been tampered with
• Fire (F) is true when there is a fire
• Alarm (A) is true when there is an alarm
• Smoke (S) is true when there is smoke
• Leaving (L) is true if there are lots of people leaving the building
• Report (R) is true if the sensor reports that lots of people are leaving the
building
Next apply the procedure described earlier
Example for BN construction: Fire Diagnosis
1. Define a total ordering of variables:
   - Let's choose an order that follows the causal sequence of events
   - Fire (F), Tampering (T), Alarm (A), Smoke (S), Leaving (L), Report (R)
2. Apply the chain rule

P(F,T,A,S,L,R) =
Example for BN construction: Fire Diagnosis
1. Define a total ordering of variables:
   - Let's choose an order that follows the causal sequence of events
   - Fire (F), Tampering (T), Alarm (A), Smoke (S), Leaving (L), Report (R)
2. Apply the chain rule

P(F,T,A,S,L,R) =
P(F) P(T|F) P(A|F,T) P(S|F,T,A) P(L|F,T,A,S) P(R|F,T,A,S,L)

We will do steps 3, 4 and 5 together, for each element P(Xi | X1, …, Xi−1) of the factorization.
3. For each variable Xi, choose the parents Pa(Xi) by evaluating conditional independencies, so that
   P(Xi | X1, …, Xi−1) = P(Xi | Pa(Xi))
4. Rewrite
   P(X1, …, Xn) = ∏i=1..n P(Xi | Pa(Xi))
5. Construct the Bayesian network
Fire Diagnosis Example
P(F) P(T|F) P(A|F,T) P(S|F,T,A) P(L|F,T,A,S) P(R|F,T,A,S,L)

Fire

Fire (F) is the first variable in the ordering, X1. It does not have
parents.

Example
P(F) P(T|F) P(A|F,T) P(S|F,T,A) P(L|F,T,A,S) P(R|F,T,A,S,L)

Tampering Fire

• Tampering (T) is independent of Fire (learning that one is true/false would not change your beliefs about the probability of the other)

Example
P(F) P(T) P(A|F,T) P(S|F,T,A) P(L|F,T,A,S) P(R|F,T,A,S,L)

Tampering Fire

• Tampering (T) is independent of Fire (learning that one is true/false would not change your beliefs about the probability of the other)

Fire Diagnosis Example
P(F) P(T) P(A|F,T) P(S|F,T,A) P(L|F,T,A,S) P(R|F,T,A,S,L)

Tampering Fire

Alarm

• Alarm (A) depends on both Fire and Tampering: it could be caused by either or both

Fire Diagnosis Example
P(F) P(T) P(A|F,T) P(S|F,T,A) P(L|F,T,A,S) P(R|F,T,A,S,L)

Tampering Fire

Alarm
Smoke

• Smoke (S) is caused by Fire, and so is independent of Tampering and Alarm given whether there is a Fire

Fire Diagnosis Example
P(F) P(T) P(A|F,T) P(S|F) P(L|F,T,A,S) P(R|F,T,A,S,L)

Tampering Fire

Alarm
Smoke

• Smoke (S) is caused by Fire, and so is independent of Tampering and Alarm given whether there is a Fire

Example
P(F) P(T) P(A|F,T) P(S|F) P(L|F,T,A,S) P(R|F,T,A,S,L)

Tampering Fire

Alarm
Smoke

Leaving

• Leaving (L) is caused by Alarm, and thus is independent of the other variables given Alarm

Fire Diagnosis Example
P(F) P(T) P(A|F,T) P(S|F) P(L|A) P(R|F,T,A,S,L)

Tampering Fire

Alarm
Smoke

Leaving

Report

• Report (R) is caused by Leaving, and thus is independent of the other variables given Leaving

Fire Diagnosis Example
P(F) P(T) P(A|F,T) P(S|F) P(L|A) P(R|L)

Tampering Fire

Alarm
Smoke

Leaving

Report

The result is the Bayesian network above, and its corresponding, very
compact factorization of the original JPD

P(F,T,A,S,L,R) = P(F) P(T) P(A|F,T) P(S|F) P(L|A) P(R|L)


Example for BN construction: Fire Diagnosis

• Note that we intermixed steps 3, 4 and 5, just because sometimes it is easier to reason about conditional dependencies graphically
• However, you can do steps 3 and 4 first
  • That is, you can simplify the product before building the network
  • You still have to reason about dependencies between each node and its predecessors in the total order

P(F) P(T|F) P(A|F,T) P(S|F,T,A) P(L|F,T,A,S) P(R|F,T,A,S,L)
Fire Diagnosis Example
P(F) P(T) P(A|F,T) P(S|F) P(L|A) P(R|L)
5. Construct the Bayesian Net (BN)
• Nodes are the random variables
• Draw a directed arc from each variable in Pa(Xi) to Xi
• Define a conditional probability table (CPT) for each variable Xi:
• P(Xi | Pa(Xi))

Tampering    Fire

Alarm
Smoke

Leaving

Report

Lecture Overview

• Recap lecture 19
• Bayesian networks: construction
• Defining Conditional Probabilities in a Bnet
• Considerations on Network Structure (time
permitting)

Example for BN construction: Fire Diagnosis

• We are not done yet: must specify the Conditional Probability Table
(CPT) for each variable. All variables are Boolean.
• How many probabilities do we need to specify for this Bayesian network?
• For instance, how many probabilities do we need to explicitly specify
for Fire?

A. 1 B. 2 C. 4 D. 8
Example for BN construction: Fire Diagnosis

• We are not done yet: must specify the Conditional Probability Table
(CPT) for each variable. All variables are Boolean.
• How many probabilities do we need to specify for this Bayesian network?
• For instance, how many probabilities do we need to explicitly specify
for Fire? P(Fire): 1 probability → P(Fire=T),
because P(Fire=F) = 1 - P(Fire=T)
Example for BN construction: Fire Diagnosis
P(Fire=t)
0.01

• How many probabilities do we need to explicitly specify


for Alarm?

Example for BN construction: Fire Diagnosis
P(Fire=t)
0.01

• How many probabilities do we need to explicitly specify


for Alarm?
P(Alarm|Tampering, Fire): 4 probabilities, 1 probability
for each of the 4 instantiations of the parents
Example for BN construction: Fire Diagnosis
P(Fire=t)
0.01

Tampering T | Fire F | P(Alarm=t|T,F) | P(Alarm=f|T,F)
t | t | 0.5    | 0.5
t | f | 0.85   | 0.15
f | t | 0.99   | 0.01
f | f | 0.0001 | 0.9999

We don't need to specify P(Alarm=f|T,F) explicitly, since the probabilities in each row must sum to 1.
Each row of this table is a conditional probability distribution.
Example for BN construction: Fire Diagnosis

• How many probabilities do we need to explicitly specify for the whole


Bayesian network?

A. 6   B. 12   C. 20   D. 2^6 - 1

Example for BN construction: Fire Diagnosis

P(Tampering=t) = 0.02      P(Fire=t) = 0.01

Tampering T | Fire F | P(Alarm=t|T,F)
t | t | 0.5
t | f | 0.85
f | t | 0.99
f | f | 0.0001

Fire F | P(Smoke=t|F)
t | 0.9
f | 0.01

Alarm | P(Leaving=t|A)
t | 0.88
f | 0.001

Leaving | P(Report=t|L)
t | 0.75
f | 0.01

…… probabilities in total, compared to the …… of the JPD for P(T,F,A,S,L,R)
Example for BN construction: Fire Diagnosis
(Same CPTs as on the previous slide.)

12 probabilities in total, compared to the 2^6 - 1 = 63 entries of the JPD for P(T,F,A,S,L,R)
Example for BN construction: Fire Diagnosis

How many probabilities do we need to specify for this Bayesian network?
• P(Tampering): 1 probability, P(T=t); likewise 1 for P(Fire)
• P(Alarm | Tampering, Fire): 4 probabilities, one for each of the 4 instantiations of the parents
• For all other variables, with only one parent each: 2 probabilities, one for the parent being true and one for the parent being false
• In total: 1 + 1 + 4 + 2 + 2 + 2 = 12 (compared to 2^6 - 1 = 63 for the full JPD!)
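The counting generalizes: a Boolean variable with k Boolean parents needs 2^k numbers. A small illustrative sketch using the fire network's parent sets:

```python
parents = {"T": [], "F": [], "A": ["T", "F"], "S": ["F"], "L": ["A"], "R": ["L"]}

def num_params(parents):
    # each Boolean variable with k Boolean parents needs 2^k numbers
    return sum(2 ** len(pa) for pa in parents.values())

print(num_params(parents))      # 1+1+4+2+2+2 = 12
print(2 ** len(parents) - 1)    # 63 free numbers for the full JPD
```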
Example for BN construction: Fire Diagnosis
(Same CPTs as above.) Once we have the CPTs in the network, we can compute any entry of the JPD:

P(Tampering=t, Fire=f, Alarm=t, Smoke=f, Leaving=t, Report=t) =
= P(Tampering=t) × P(Fire=f) × P(Alarm=t | Tampering=t, Fire=f) × P(Smoke=f | Fire=f) × P(Leaving=t | Alarm=t) × P(Report=t | Leaving=t) =
= 0.02 × (1 - 0.01) × 0.85 × (1 - 0.01) × 0.88 × 0.75 ≈ 0.011
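As a cross-check of the computation above, the fire-diagnosis CPTs can be plugged into the joint() sketch from the definition section (structure and numbers exactly as in the slides); it reproduces the value of about 0.011.

```python
parents = {"T": (), "F": (), "A": ("T", "F"), "S": ("F",), "L": ("A",), "R": ("L",)}
cpt = {
    "T": {(): 0.02},
    "F": {(): 0.01},
    "A": {(True, True): 0.5, (True, False): 0.85,
          (False, True): 0.99, (False, False): 0.0001},
    "S": {(True,): 0.9, (False,): 0.01},
    "L": {(True,): 0.88, (False,): 0.001},
    "R": {(True,): 0.75, (False,): 0.01},
}

# The entry computed on this slide: T=t, F=f, A=t, S=f, L=t, R=t.
world = {"T": True, "F": False, "A": True, "S": False, "L": True, "R": True}
print(joint(parents, cpt, world))   # ≈ 0.011
```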
In Summary
• In a Belief network, the JPD of the variables involved is defined as the product of the local conditional distributions:
  P(X1, …, Xn) = ∏i P(Xi | X1, …, Xi−1) = ∏i P(Xi | Pa(Xi))
• Any entry of the JPD can be computed given the CPTs in the network
• Once we know the JPD, we can answer any query about any subset of the variables (see Inference by Enumeration topic)

Thus, a Belief network allows one to answer any query on any subset of the variables.
Bayesian Networks: Types of Query/Inference

[Figure: four mini-networks. Diagnostic, Predictive and Mixed use the chain Fire → Alarm → Leaving; Intercausal uses Smoking At Sensor → Alarm ← Fire]

• Diagnostic (from effect to cause): people are leaving, P(L=t)=1; query P(F | L=t) = ?
• Predictive (from cause to effect): fire happens, P(F=t)=1; query P(L | F=t) = ?
• Mixed: there is no fire (F=f) and people are leaving (L=t); query P(A | F=f, L=t) = ?
• Intercausal (between causes of a common effect): a person smokes next to the sensor (S=t) and the alarm goes off (A=t); query P(F | A=t, S=t) = ?

There are algorithms that leverage the Bnet structure to perform query answering efficiently
- For instance variable elimination, which we will cover soon
- First, however, we will think a bit more about network structure
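Inference by enumeration can be sketched directly on top of the fire network: the snippet below assumes the parents, cpt and joint definitions from the earlier sketches, and answers the diagnostic and predictive queries by summing out the hidden variables. The prob helper is illustrative, not a named algorithm from the slides.

```python
from itertools import product

VARS = ["T", "F", "A", "S", "L", "R"]

def prob(query, evidence):
    """P(query | evidence) by enumeration over the hidden variables."""
    def total(fixed):
        hidden = [v for v in VARS if v not in fixed]
        return sum(joint(parents, cpt, {**fixed, **dict(zip(hidden, vals))})
                   for vals in product([True, False], repeat=len(hidden)))
    return total({**query, **evidence}) / total(evidence)

print(prob({"F": True}, {"L": True}))   # diagnostic: P(Fire=t | Leaving=t)
print(prob({"L": True}, {"F": True}))   # predictive: P(Leaving=t | Fire=t)
```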
Learning Goals so Far
• Given a JPD
• Marginalize over specific variables
• Compute distributions over any subset of the variables
• Use inference by enumeration
• to compute joint posterior probability distributions over any subset
of variables given evidence
• Define and use marginal and conditional independence
• Build a Bayesian Network for a given domain (structure)
• Specify the necessary conditional probabilities
• Compute the representational savings in terms of number
of probabilities required

Compactness
• In a Bnet, how many rows do we need to explicitly store for the CPT of a Boolean variable Xi with k Boolean parents?

Compactness
• A CPT for a Boolean variable Xi with k Boolean parents has 2^k rows for the combinations of parent values
• If each variable has no more than k parents, the complete network requires specifying n·2^k numbers
• For k << n, this is a substantial improvement:
  • the numbers required grow linearly with n, vs. O(2^n) for the full joint distribution
• E.g., if we have a Bnet with 30 Boolean variables, each with 5 parents:
  • we need to specify 30 × 2^5 = 960 probabilities
  • but we would need 2^30 for the JPD
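A quick illustrative check of the two counts in the example above:

```python
n, k = 30, 5
print(n * 2 ** k)    # 960 numbers for the Bnet
print(2 ** n - 1)    # 1_073_741_823 free numbers for the full JPD
```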
Realistic BNet: Liver Diagnosis
Source: Onisko et al., 1999

~ 60 nodes, max 4 parents per node
Need ~ 60 × 2^4 = 15 × 2^6 = 960 probabilities instead of 2^60 probabilities for the JPD
Compactness
• What happens if the network is fully connected, or k ≈ n?
  • Not much saving compared to the numbers needed to specify the full JPD
• Bnets are useful in sparse (or locally structured) domains
  • Domains in which each component interacts with (is related to) a small fraction of the other components
• What if this is not the case in a domain we need to reason about?
  • May need to make simplifying assumptions to reduce the dependencies in the domain
“Where do the numbers (CPTs) come from?”
From experts
• Tedious
• Costly
• Not always reliable
From data ⇒ Machine Learning
• There are algorithms to learn both the structure and the numbers (CPSC 340, CPSC 422)
• It can be hard to get enough data

Still, usually better than specifying the full JPD.
What if we use a different ordering?
• What happens if we use the following order:
  • Leaving; Tampering; Report; Smoke; Alarm; Fire
• We end up with a completely different network structure! (Try it as an exercise.)

  Leaving   Tampering
  Report    Alarm
  Smoke     Fire

• Which of the two structures is better?
Which Structure is Better?

  Leaving   Tampering
  Report    Alarm
  Smoke     Fire

• The non-causal network is less compact: 1 + 2 + 2 + 4 + 8 + 8 = 25 numbers needed
• Deciding on conditional independence is hard in non-causal directions
  • Causal models and conditional independence seem hardwired for humans!
• Specifying the conditional probabilities may be harder than in the causal direction
  • For instance, we have lost the direct dependency between the alarm and one of its causes, which essentially describes the alarm's reliability (info often provided by the maker)
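The 12-vs-25 comparison follows from the same counting rule as before (2^k numbers for a Boolean node with k Boolean parents); a quick illustrative check with the parent sets the two orderings induce:

```python
def num_params(parents):
    return sum(2 ** len(pa) for pa in parents.values())

causal = {"T": [], "F": [], "A": ["T", "F"], "S": ["F"], "L": ["A"], "R": ["L"]}
noncausal = {"L": [], "T": ["L"], "R": ["L"], "S": ["L", "T"],
             "A": ["S", "L", "T"], "F": ["S", "A", "T"]}

print(num_params(causal))     # 12
print(num_params(noncausal))  # 1+2+2+4+8+8 = 25
```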
Example contd.
• Other than that, our two Bnets for the Alarm problem are equivalent as long as they represent the same probability distribution

  Leaving   Tampering
  Report    Alarm
  Smoke     Fire

Variable ordering T,F,A,S,L,R vs. variable ordering L,T,R,S,A,F:

P(T,F,A,S,L,R) = P(T) P(F) P(A|T,F) P(S|F) P(L|A) P(R|L) =
= P(L) P(T|L) P(R|L) P(S|L,T) P(A|S,L,T) P(F|S,A,T)

i.e., they are equivalent if the corresponding CPTs are specified so that they satisfy the equation above.
Are there wrong network structures?
• Given an order of variables, a network with arcs in excess of those required by the direct dependencies implied by that order is still OK
  • Just not as efficient

  Leaving   Tampering
  Report    Alarm
  Smoke     Fire

P(L) P(T|L) P(R|L) P(S|L,R,T) P(A|S,L,T) P(F|S,A,T) =
= P(L) P(T|L) P(R|L) P(S|L,T) P(A|S,L,T) P(F|S,A,T)

• One extreme: the fully connected network is always correct but rarely the best choice
  • It corresponds to just applying the chain rule to the JPD, without leveraging conditional independencies to simplify the factorization:
P(L,T,R,S,A,F) = P(L) P(T|L) P(R|L,T) P(S|L,T,R) P(A|S,L,T,R) P(F|S,A,T,L,R)
Are there wrong network structures?
P(L,T,R,S,A,F) = P(L) P(T|L) P(R|L,T) P(S|L,T,R) P(A|S,L,T,R) P(F|S,A,T,L,R)

  Leaving   Tampering
  Report    Alarm
  Smoke     Fire
Are there wrong network structures?
• How can a network structure be wrong?
  • If it misses directed edges that are required
  • E.g. an edge is missing below, making Fire conditionally independent of Alarm given Tampering and Smoke

  Leaving   Tampering
  Report    Alarm
  Smoke     Fire

But they are not independent: for instance, P(Fire=t | Smoke=f, Tampering=f, Alarm=t) should be higher than P(Fire=t | Smoke=f, Tampering=f).
Are there wrong network structures?
• How can a network structure be wrong?
  • If it misses directed edges that are required
  • E.g. an edge is missing below: Fire is not conditionally independent of Alarm | {Tampering, Smoke}

  Leaving   Tampering
  Report    Alarm
  Smoke     Fire

But remember what we said a few slides back: sometimes we may need to make simplifying assumptions (e.g. assume conditional independence when it does not actually hold) in order to reduce complexity.
Summary of Dependencies in a Bayesian Network
In 1, 2 and 3, X and Y are dependent (grey areas represent existing evidence/observations):

[Figure: 1. a chain between X and Y through Z, with Z unobserved; 2. a common cause Z of X and Y, with Z unobserved; 3. a common effect Z of X and Y, with evidence on Z or on one of its descendants]

• In 3, X and Y become dependent as soon as there is evidence on Z or on any of its descendants.
• This is because knowledge of one possible cause, given evidence of the effect, explains away the other cause.
Dependencies in a Bayesian Network: summary
In 1, 2 and 3, X and Y are dependent (grey areas represent existing evidence/observations):

1. Chain: Up(s2) → Power(w0) → Lit(l1), with Power(w0) unobserved
2. Common cause: Understood Material → {Assignment Grade, Exam Grade}, with Understood Material unobserved
3. Common effect: Smoking At Sensor → Alarm ← Fire, with evidence on Alarm

• In 3, X and Y become dependent as soon as there is evidence on Z or on any of its descendants.
• This is because knowledge of one possible cause, given evidence of the effect, explains away the other cause.
Or Conditional Independencies
Or, blocking paths for probability propagation. Three ways in which a path between Y and X (or vice versa) can be blocked, given evidence E:

[Figure: 1. a chain Y - Z - X with Z observed; 2. a common cause Z of X and Y, with Z observed; 3. a common effect Z of X and Y, with no evidence on Z or any of its descendants]

• In 3, X and Y are independent if there is no evidence on their common effect (recall Fire and Tampering in the alarm example).
Or Conditional Independencies
Or, blocking paths for probability propagation. Three ways in which a path between Y and X (or vice versa) can be blocked, given evidence E:

1. Chain: Up(s2) → Power(w0) → Lit(l1), with Power(w0) observed
2. Common cause: Understood Material → {Assignment Grade, Exam Grade}, with Understood Material observed
3. Common effect: Smoking At Sensor → Alarm ← Fire, with no evidence on Alarm

• In 3, X and Y are independent if there is no evidence on their common effect (recall Fire and Tampering in the alarm example).
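These three blocking rules can be written out as a toy function. This is a simplified sketch, not from the slides: real d-separation must also consider descendants of Z in the common-effect case, and all paths between X and Y; the function name and encoding are illustrative.

```python
def path_blocked(pattern: str, z_observed: bool) -> bool:
    """Is the Y-Z-X path blocked, given whether Z is observed?
    pattern: 'chain' (Y->Z->X), 'common_cause' (Y<-Z->X),
             'common_effect' (Y->Z<-X)."""
    if pattern in ("chain", "common_cause"):
        return z_observed          # observing Z blocks the path
    if pattern == "common_effect":
        return not z_observed      # v-structure: blocked unless Z is observed
    raise ValueError(pattern)

# Lit(l1) independent of Up(s2) given Power(w0): chain blocked by evidence
print(path_blocked("chain", z_observed=True))           # True
# Fire independent of SmokingAtSensor with no evidence on Alarm: blocked
print(path_blocked("common_effect", z_observed=False))  # True
```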
Practice in the AISpace Applet
• Open the Belief and Decision Networks applet
• Load the problem: Conditional Independence Quiz
• Click on Independence Quiz
Practice in the AISpace Applet
• Answer Quizzes in the Conditional Independence Quiz Panel
Learning Goals so Far
• Given a JPD
• Marginalize over specific variables
• Compute distributions over any subset of the variables
• Use inference by enumeration
• to compute joint posterior probability distributions over any subset
of variables given evidence
• Define and use marginal and conditional independence
• Build a Bayesian Network for a given domain (structure)
• Specify the necessary conditional probabilities
• Compute the representational savings in terms of number of
probabilities required
• Identify dependencies/independencies between nodes in a Bayesian
Network

Now we will see how to do inference in Bnets.
