
Bayes’ Theorem and Hidden Markov Models (HMMs)

1. Introduction to Bayes’ Theorem

Definition: Bayes' Theorem provides a way to update our beliefs based on new evidence.

Formula: P(A∣B) = P(B∣A) · P(A) / P(B)

Components:

• P(A∣B): Posterior probability — probability of hypothesis A given data B.

• P(B∣A): Likelihood — probability of data B given hypothesis A.

• P(A): Prior probability — initial probability of hypothesis A.

• P(B): Marginal likelihood — probability of observing B.

Applications:

• Medical diagnosis: Updating the probability of a disease given observed symptoms or test results (see the numeric sketch after this list).

• Spam filtering: Classifying emails as spam or not based on keywords.
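As a numeric illustration of the medical-diagnosis application, here is a minimal Python sketch of Bayes' Theorem. The 1% prevalence and the test's sensitivity and specificity are invented example figures, not real clinical data.

# Hedged numeric illustration of Bayes' Theorem for medical diagnosis.
# All prevalence and test-accuracy numbers below are made-up examples.

def posterior(prior, likelihood, marginal):
    """P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood * prior / marginal

# Hypothetical numbers: 1% disease prevalence, a test with 95% sensitivity
# (true-positive rate) and 90% specificity (true-negative rate).
p_disease = 0.01               # P(A): prior
p_pos_given_disease = 0.95     # P(B|A): likelihood
p_pos_given_healthy = 0.10     # false-positive rate = 1 - specificity

# P(B): total probability of a positive test (marginal likelihood).
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

p_disease_given_pos = posterior(p_disease, p_pos_given_disease, p_pos)
print(f"P(disease | positive test) = {p_disease_given_pos:.3f}")  # ~0.088

Note how the posterior (~8.8%) stays far below the test's sensitivity because the prior (1% prevalence) is so small; this is exactly the belief update Bayes' Theorem formalizes.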

2. Introduction to Hidden Markov Models (HMMs)

Definition: HMMs are statistical models in which the system being modeled is assumed to be a Markov process with unobserved (hidden) states.

Key Components (illustrated in the code sketch after this list):

1. States: The hidden variables (e.g., weather conditions).

2. Observations: Visible outcomes linked to hidden states (e.g., activities influenced by the
weather).

3. Transition Probabilities: Probabilities of moving from one state to another.

4. Emission Probabilities: Probabilities of an observation given a state.

5. Initial Probabilities: Probability distribution over initial states.
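As a concrete illustration, here is a minimal Python sketch of these five components for the weather/activity example used throughout this outline. All probability values are invented for illustration.

# A minimal sketch of the five HMM components for the weather/activity
# example; every probability value here is an illustrative assumption.

states = ["Sunny", "Rainy", "Cloudy"]        # 1. hidden states
activities = ["walk", "shop", "clean"]       # 2. possible observations

# 3. Transition probabilities: P(next state | current state).
transition = {
    "Sunny":  {"Sunny": 0.6, "Rainy": 0.1, "Cloudy": 0.3},
    "Rainy":  {"Sunny": 0.2, "Rainy": 0.5, "Cloudy": 0.3},
    "Cloudy": {"Sunny": 0.3, "Rainy": 0.3, "Cloudy": 0.4},
}

# 4. Emission probabilities: P(observation | state).
emission = {
    "Sunny":  {"walk": 0.7, "shop": 0.2, "clean": 0.1},
    "Rainy":  {"walk": 0.1, "shop": 0.3, "clean": 0.6},
    "Cloudy": {"walk": 0.4, "shop": 0.4, "clean": 0.2},
}

# 5. Initial probabilities: P(first state).
initial = {"Sunny": 0.5, "Rainy": 0.2, "Cloudy": 0.3}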

Applying Bayes’ Theorem to HMMs

• Inference: Using Bayes’ Theorem in HMMs helps estimate hidden states based on
observations.

• Example Scenario: Given a sequence of observations, determine the most probable sequence of hidden states (a single-step Bayesian version of this computation is sketched below).
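To make the connection to Bayes' Theorem explicit, the snippet below applies it to a single time step, computing the posterior over hidden weather states given one observed activity. It reuses the illustrative initial and emission tables from the component sketch above.

# Single-step HMM inference via Bayes' Theorem:
# P(state | obs) = P(obs | state) * P(state) / P(obs),
# reusing the illustrative `states`, `initial`, and `emission` tables above.

obs = "walk"

# Marginal likelihood P(obs) = sum over states of P(obs|state) * P(state).
p_obs = sum(emission[s][obs] * initial[s] for s in states)

for s in states:
    p_state_given_obs = emission[s][obs] * initial[s] / p_obs
    print(f"P({s} | {obs}) = {p_state_given_obs:.3f}")

The full HMM algorithms below extend this one-step update across a whole observation sequence, carrying the transition probabilities along.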
Solving HMM Problems with Bayes’ Theorem

• Evaluation (Forward Algorithm): Calculate the likelihood of a sequence of observations under the model.

• Decoding (Viterbi Algorithm): Find the most probable sequence of hidden states (see the sketch after this list).

• Learning (Baum-Welch Algorithm): Adjust model parameters to maximize the likelihood of the observations.
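Below is a minimal plain-Python sketch of the evaluation and decoding steps, reusing the states, transition, emission, and initial tables from the component sketch above. The Baum-Welch learning step is omitted for brevity, and all probabilities remain illustrative assumptions.

# Hedged sketches of evaluation (forward algorithm) and decoding (Viterbi),
# reusing the illustrative `states`, `transition`, `emission`, and
# `initial` tables defined earlier.

def forward_likelihood(obs_seq):
    """Evaluation: P(observation sequence | model) via the forward algorithm."""
    # alpha[s] = probability of the observations so far AND ending in state s.
    alpha = {s: initial[s] * emission[s][obs_seq[0]] for s in states}
    for obs in obs_seq[1:]:
        alpha = {
            s: emission[s][obs] * sum(alpha[p] * transition[p][s] for p in states)
            for s in states
        }
    return sum(alpha.values())

def viterbi(obs_seq):
    """Decoding: most probable hidden-state sequence for obs_seq."""
    # prob[s] = probability of the best path ending in state s;
    # path[s] = that best path itself.
    prob = {s: initial[s] * emission[s][obs_seq[0]] for s in states}
    path = {s: [s] for s in states}
    for obs in obs_seq[1:]:
        new_prob, new_path = {}, {}
        for s in states:
            # Choose the best predecessor state for s at this step.
            best_prev = max(states, key=lambda p: prob[p] * transition[p][s])
            new_prob[s] = prob[best_prev] * transition[best_prev][s] * emission[s][obs]
            new_path[s] = path[best_prev] + [s]
        prob, path = new_prob, new_path
    best_last = max(states, key=lambda s: prob[s])
    return path[best_last], prob[best_last]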

Practical Example of HMM with Observations

• Setup: Observations of an individual's activities (e.g., walking, shopping, cleaning) linked to weather conditions.

• Model:

• States: Sunny, Rainy, Cloudy.

• Observations: The three activities (walking, shopping, cleaning).

• Inference Steps:

1. Calculate the initial probabilities for each state.

2. Use the Viterbi Algorithm to find the state sequence with the highest probability (run in the sketch below).
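A short usage sketch tying these steps together, assuming the hypothetical model tables and the forward_likelihood/viterbi functions defined earlier:

# Running the sketches above on a hypothetical three-day observation
# sequence (walking, then shopping, then cleaning).

obs_seq = ["walk", "shop", "clean"]

likelihood = forward_likelihood(obs_seq)      # evaluation step
best_states, best_prob = viterbi(obs_seq)     # decoding step

print(f"P(observations) = {likelihood:.4f}")
print(f"Most probable states: {best_states} (path probability {best_prob:.4f})")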

This outline provides a coherent flow for a lecture covering Bayes' Theorem and its application in Hidden Markov Models, incorporating the concepts, formulas, and practical examples typically used in such discussions.

Detailed Explanation of Practical Example in HMM

For a detailed walkthrough, refer to the attached video.
