
Sum-Product Algorithm

Sum-Product Algorithm for Hidden Markov Models
• Sum-Product Algorithm: Applied to Hidden Markov Models (HMMs) for computing marginal distributions of the hidden states.
• Efficient Calculation: The algorithm allows for efficient computation by breaking the problem down into manageable pieces.
• Forward-Backward Algorithm: The sum-product algorithm is essentially equivalent to the forward-backward algorithm used in HMMs.
• General Framework: Derived from the more general concepts of factor graphs and message passing.
• Graph Transformation: Start by converting the directed graph into a factor graph.
• Representation: The factor graph shows all variables explicitly, both latent and observed.
• Inference Simplification: To solve the inference problem, we condition on the observed variables x_1, x_2, …, x_N.
• Factor Graph Simplification: Absorb the emission probabilities into the transition probability factors.
• Resulting Factors: The simplified transition probability factors include both the original transition and emission probabilities, as given in equation 1 below (a small code sketch follows the figures).

f_n(z_{n-1}, z_n) = p(z_n | z_{n-1}) p(x_n | z_n)        (equation 1)

(Figures: the HMM factor graph and the simplified factor graph)
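As an illustration of equation 1, here is a minimal numerical sketch of how the emission probability is absorbed into the transition factor. It assumes a small discrete HMM with a hypothetical transition matrix A and emission matrix B; none of these names or values come from the slides.

```python
import numpy as np

# Hypothetical discrete HMM parameters (illustrative values only):
# A[i, j] = p(z_n = j | z_{n-1} = i)   -- transition probabilities
# B[j, k] = p(x_n = k | z_n = j)       -- emission probabilities
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])

def combined_factor(x_n):
    """f_n(z_{n-1}, z_n) = p(z_n | z_{n-1}) p(x_n | z_n), as in equation 1."""
    return A * B[:, x_n]   # multiply column z_n of A by the emission term for x_n

print(combined_factor(0))  # combined factor when the observation is symbol 0
```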
• Alpha-Beta Algorithm Derivation: Start by denoting the final hidden variable z_N as the root node.
• Message Passing: First, pass messages from the leaf node h to the root node.
• Message Propagation: Use the general results (equations 8.66 and 8.69) for message propagation.
μ_{z_{n-1} → f_n}(z_{n-1}) = μ_{f_{n-1} → z_{n-1}}(z_{n-1})        (equation 2)

μ_{f_n → z_n}(z_n) = Σ_{z_{n-1}} f_n(z_{n-1}, z_n) μ_{z_{n-1} → f_n}(z_{n-1})        (equation 3)

• Form of Messages: In the Hidden Markov Model, the propagated messages follow the structure derived from these general results.

We can eliminate μ_{z_{n-1} → f_n}(z_{n-1}) from equation 3 using equation 2, giving a recursion for the factor-to-variable messages of the form

μ_{f_n → z_n}(z_n) = Σ_{z_{n-1}} f_n(z_{n-1}, z_n) μ_{f_{n-1} → z_{n-1}}(z_{n-1})

If we now recall the definition in equation 1, and if we define

α(z_n) = μ_{f_n → z_n}(z_n)

we obtain the alpha recursion

α(z_n) = p(x_n | z_n) Σ_{z_{n-1}} α(z_{n-1}) p(z_n | z_{n-1})

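A minimal Python/NumPy sketch of this alpha recursion, assuming a small discrete HMM with illustrative parameters A, B, pi and an observation sequence obs (all hypothetical, not taken from the slides):

```python
import numpy as np

# Illustrative HMM parameters (hypothetical values).
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])          # A[i, j] = p(z_n = j | z_{n-1} = i)
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])          # B[j, k] = p(x_n = k | z_n = j)
pi = np.array([0.5, 0.5])           # p(z_1)
obs = [0, 1, 0]                     # observed symbols x_1, ..., x_N

def forward(obs):
    """Alpha recursion: alpha[n, j] = p(x_1, ..., x_n, z_n = j)."""
    alpha = np.zeros((len(obs), len(pi)))
    alpha[0] = pi * B[:, obs[0]]                      # message from the leaf factor
    for n in range(1, len(obs)):
        alpha[n] = B[:, obs[n]] * (alpha[n - 1] @ A)  # emission times summed transition
    return alpha

alpha = forward(obs)
print(alpha[-1].sum())              # summing over z_N gives the likelihood p(X)
```

Summing the final row of alpha over the states of z_N gives the likelihood p(X), which is used later when normalising the marginals.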
Next we consider the messages that are propagated from the root node back to the leaf node. These take the form

μ_{f_{n+1} → z_n}(z_n) = Σ_{z_{n+1}} f_{n+1}(z_n, z_{n+1}) μ_{f_{n+2} → z_{n+1}}(z_{n+1})

where, as before, we have eliminated the messages of the type μ_{z_{n+1} → f_{n+1}}(z_{n+1}), since the variable nodes perform no computation. Using the definition in equation 1 to substitute for f_{n+1}(z_n, z_{n+1}), and defining

β(z_n) = μ_{f_{n+1} → z_n}(z_n)

we obtain the beta recursion

β(z_n) = Σ_{z_{n+1}} β(z_{n+1}) p(x_{n+1} | z_{n+1}) p(z_{n+1} | z_n)
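Likewise, a sketch of the beta recursion under the same illustrative (hypothetical) parameters:

```python
import numpy as np

# Same illustrative parameters as in the forward sketch above.
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])
obs = [0, 1, 0]

def backward(obs):
    """Beta recursion: beta[n, i] = p(x_{n+1}, ..., x_N | z_n = i)."""
    beta = np.zeros((len(obs), A.shape[0]))
    beta[-1] = 1.0                                     # message sent back from the root node z_N
    for n in range(len(obs) - 2, -1, -1):
        beta[n] = A @ (B[:, obs[n + 1]] * beta[n + 1])
    return beta

print(backward(obs))
```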
The sum-product algorithm also specifies how to evaluate the marginals once all the messages have been evaluated. In particular, the result (8.63) shows that the local marginal at the node z_n is given by the product of the incoming messages. Because we have conditioned on the variables X = {x_1, …, x_N}, we are computing the joint distribution

p(z_n, X) = μ_{f_n → z_n}(z_n) μ_{f_{n+1} → z_n}(z_n) = α(z_n) β(z_n)

Dividing both sides by p(X), we then obtain

γ(z_n) = p(z_n | X) = α(z_n) β(z_n) / p(X)
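Putting the two recursions together, here is a self-contained sketch (same hypothetical parameters as above) that evaluates the posterior marginals γ(z_n) = α(z_n)β(z_n)/p(X):

```python
import numpy as np

# Illustrative (hypothetical) parameters, combining the alpha and beta recursions.
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])
obs = [0, 1, 0]

N, K = len(obs), len(pi)
alpha = np.zeros((N, K))
beta = np.ones((N, K))
alpha[0] = pi * B[:, obs[0]]
for n in range(1, N):
    alpha[n] = B[:, obs[n]] * (alpha[n - 1] @ A)       # alpha recursion
for n in range(N - 2, -1, -1):
    beta[n] = A @ (B[:, obs[n + 1]] * beta[n + 1])     # beta recursion

likelihood = alpha[-1].sum()            # p(X)
gamma = alpha * beta / likelihood       # gamma[n, j] = p(z_n = j | X)
print(gamma)
print(gamma.sum(axis=1))                # each row sums to 1
```

Each row of gamma sums to one, since α(z_n)β(z_n) summed over the states of z_n equals p(X) for every n.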
