
Lecture Week11

The document discusses hidden Markov models and machine learning. It provides an overview of Markov chains, defines hidden Markov models, and gives examples of how HMMs can be used. It also discusses machine learning fundamentals and federated learning.

Uploaded by

Reedus
Copyright
© All Rights Reserved

Hidden Markov Model

and Machine Learning


Dr. Syed Maaz Shahid

13th May 2024


Outline
• Markov Chain
• Hidden Markov Model
• Working of HMM
• Example scenarios
• Applications of HMM
• Limitations of HMM
• Machine Learning (ML) Preliminaries
• Types of machine learning
• Federated Learning (FL)
Markov Chain
• A Markov chain is a discrete-time and discrete-valued
random process in which each new sample depends only
on the previous sample.

• Let {X_n}, n = 0, …, N, be a sequence of random variables taking values
in the countable set Ω.
• Def: X_n is a Markov chain if, for all values x_k and all n,

P(X_n = x_n | X_k = x_k for all k < n) = P(X_n = x_n | X_{n-1} = x_{n-1})
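The Markov property above can be illustrated by sampling from a small chain: each new state is drawn using only the previous state. A minimal sketch; the two weather states and the transition numbers are illustrative, not from the lecture.

```python
import random

# Hypothetical two-state chain; states and probabilities are made up.
STATES = ["Sunny", "Rainy"]
# P[i][j] = probability of moving from state i to state j
P = [[0.8, 0.2],   # from Sunny
     [0.4, 0.6]]   # from Rainy

def sample_chain(n_steps, start=0, rng=random.Random(0)):
    """Sample X_0..X_n: each step depends only on the previous state."""
    x = start
    path = [STATES[x]]
    for _ in range(n_steps):
        x = rng.choices([0, 1], weights=P[x])[0]
        path.append(STATES[x])
    return path

print(sample_chain(5))
```

Note that the sampler never looks further back than the current state, which is exactly the conditional-independence statement in the definition.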


Markov Chain
• A Markov chain assigns probabilities to sequences of random
variables (states).

• The basic idea behind a Markov chain is to assume that X_k
captures all the relevant information for predicting the future.

Fig: state transition diagram for a Markov chain
Hidden Markov Model
• A Markov chain is useful when we need to compute a
probability for a sequence of observable events.
• What if the events we are interested in are hidden?

• A hidden Markov model (HMM) allows us to talk about both


observed events and hidden events.

• For example: how do you know whether your wife is happy or not?


• Determine from observable external factors
Hidden Markov Model
• An HMM is specified by the following components:
• states
• transition probability matrix
• observation likelihoods (emission probabilities)
• initial probability distribution over states

• How do we obtain the HMM?
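Before they are learned from data, the components listed above are just a few arrays. A minimal sketch, with illustrative numbers (not from the lecture), of what "specifying an HMM" amounts to:

```python
import numpy as np

# Illustrative HMM with 2 hidden states and 2 observation symbols;
# all numbers here are made up for this sketch.
states = ["Rainy", "NotRainy"]
obs_symbols = ["Umbrella", "NoUmbrella"]

A = np.array([[0.7, 0.3],    # transition probabilities P(state_t | state_{t-1})
              [0.3, 0.7]])
B = np.array([[0.9, 0.1],    # emission probabilities P(obs | state)
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])    # initial state distribution

# Each row must be a valid probability distribution.
assert np.allclose(A.sum(axis=1), 1.0)
assert np.allclose(B.sum(axis=1), 1.0)
assert np.isclose(pi.sum(), 1.0)
```

Obtaining the HMM then means filling in A, B, and pi, either from domain knowledge (as in the umbrella example that follows) or by learning them from observation sequences.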


Hidden Markov Model
• Example Scenario: Umbrella World (Scenario from chapter 15 of Russell & Norvig)

• Elspeth Dunsany is an AI researcher.


• Richard Feynman is an AI; his workstation is not connected to the
internet.
• He has noticed that Elspeth sometimes brings an umbrella to work.
• He correctly infers that she is more likely to carry an umbrella on
days when it rains.
Hidden Markov Model
• Richard proposes a hidden Markov model:
• Rain on day t − 1, 𝑅𝑡−1 , makes rain on day t, 𝑅𝑡 , more likely.
• Elspeth usually brings her umbrella 𝑈𝑡 on days when it rains 𝑅𝑡 ,
but not always.
Hidden Markov Model
• Richard learns that the weather changes on 3 out of 10 days:

P(R_t | R_{t-1}) = 0.7, P(R_t | ~R_{t-1}) = 0.3

• Also, Elspeth sometimes forgets her umbrella when it's raining,
and sometimes brings an umbrella when it's not raining:

P(U_t | R_t) = 0.9, P(U_t | ~R_t) = 0.1
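With these numbers, Richard can update his belief about rain after seeing the umbrella. A minimal sketch of one predict-then-update step, assuming a uniform prior P(R_0) = 0.5 (the prior is an assumption, not stated in the lecture):

```python
# Belief update in the umbrella world, using the numbers from the slide.
# Assumes a uniform prior P(R_0) = 0.5 (not given in the lecture).

P_R_given_R = 0.7      # P(R_t | R_{t-1})
P_R_given_notR = 0.3   # P(R_t | ~R_{t-1})
P_U_given_R = 0.9      # P(U_t | R_t)
P_U_given_notR = 0.1   # P(U_t | ~R_t)

# Predict: P(R_1) = P(R_1 | R_0) P(R_0) + P(R_1 | ~R_0) P(~R_0)
p_r0 = 0.5
p_r1 = P_R_given_R * p_r0 + P_R_given_notR * (1 - p_r0)   # = 0.5

# Update on observing the umbrella (Bayes' rule):
num = P_U_given_R * p_r1
den = num + P_U_given_notR * (1 - p_r1)
p_r1_given_u = num / den
print(round(p_r1_given_u, 2))  # 0.9
```

Seeing the umbrella raises the probability of rain from 0.5 to 0.9, because the umbrella is a strong (though imperfect) signal under these emission probabilities.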
Hidden Markov Model
• The HMM is characterized by three fundamental problems

• Likelihood: Given an HMM λ = (A, B) (parameters) and an observation
sequence O, determine the likelihood of the observed
sequence, P(O | λ).
• Decoding: Given observation sequence 𝑂 and an HMM 𝜆 = (𝐴, 𝐵),
discover the best hidden state sequence 𝑄.
• Learning: Given an observation sequence 𝑂 and the set of states in
the HMM, learn the HMM parameters 𝐴 and 𝐵.
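The first two problems have standard dynamic-programming solutions: the forward recursion for likelihood and the Viterbi recursion for decoding. A sketch using the umbrella parameters from the previous slide, with an assumed uniform initial distribution (observation 0 = umbrella, 1 = no umbrella; state 0 = rain, 1 = no rain):

```python
import numpy as np

# HMM parameters lambda = (A, B) from the umbrella slides; the uniform
# initial distribution pi is an assumption, not given in the lecture.
A = np.array([[0.7, 0.3],   # rows/cols: [Rain, NoRain]
              [0.3, 0.7]])
B = np.array([[0.9, 0.1],   # cols: [Umbrella, NoUmbrella]
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])

def forward_likelihood(obs):
    """Likelihood problem: P(O | lambda) via the forward recursion."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

def viterbi(obs):
    """Decoding problem: most likely hidden state sequence Q."""
    delta = pi * B[:, obs[0]]
    back = []
    for o in obs[1:]:
        trans = delta[:, None] * A          # trans[i, j]: via state i to j
        back.append(trans.argmax(axis=0))
        delta = trans.max(axis=0) * B[:, o]
    # Backtrack from the best final state.
    q = [int(delta.argmax())]
    for ptr in reversed(back):
        q.append(int(ptr[q[-1]]))
    return q[::-1]

obs = [0, 0, 1]                      # Umbrella, Umbrella, NoUmbrella
print(forward_likelihood(obs))
print(viterbi(obs))                  # 0 = Rain, 1 = NoRain
```

The learning problem (estimating A and B from O) is solved with the Baum-Welch algorithm, an instance of expectation-maximization, which is not sketched here.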
Hidden Markov Model
• Example scenario-2
Hidden Markov Model
• The next state and the current observation depend only on the
current state.
Hidden Markov Model
• Likelihood (the probability of the observation sequence)
Applications of Hidden Markov Model
• Speech Recognition
• observations are acoustic signals, hidden states correspond to the
different sounds
• Natural Language Processing
• observations are the words in the text, hidden states are associated
with the underlying grammar or structure of the text
• Bioinformatics
• Finance
• observations are the stock prices, interest rates, or exchange rates,
hidden states correspond to different economic states
Limitations of Hidden Markov Models
• Limited Modeling Capabilities

• Overfitting

• Lack of Robustness

• Computational Complexity
Assignment
• Find a paper that uses HMM to solve a problem in your
relevant field.

• Make a report and submit it by 26 May 2024.


What is Machine Learning
• Machine learning is a subfield of artificial intelligence (AI).
• It gives machines the ability to learn automatically from data and past
experience, identifying patterns to make predictions with minimal
human intervention.

• Machine learning algorithms employ statistics to detect


patterns in massive amounts of data.
• The data can be anything: numbers, words, images, or signals.
How Does Machine Learning Work?
Categorization of Machine Learning
Supervised vs Unsupervised Learning
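The contrast on this slide can be sketched in a few lines: supervised learning fits a predictor to labeled examples, while unsupervised learning finds structure in unlabeled data. The data, the regression model, and the hand-rolled 2-means loop below are all illustrative, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Supervised: inputs X come WITH labels y; we fit a predictor (least squares).
X = rng.normal(size=(100, 1))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=100)
w, *_ = np.linalg.lstsq(np.hstack([X, np.ones((100, 1))]), y, rcond=None)
print("learned slope ~", round(w[0], 1))   # close to the true 3.0

# Unsupervised: only inputs, no labels; we look for structure (2-means).
data = np.concatenate([rng.normal(0, 0.5, 50), rng.normal(5, 0.5, 50)])
centers = np.array([data.min(), data.max()])
for _ in range(10):
    assign = np.abs(data[:, None] - centers).argmin(axis=1)
    centers = np.array([data[assign == k].mean() for k in (0, 1)])
print("cluster centers ~", np.round(centers, 1))  # near 0 and 5
```

In the supervised case the target values drive the fit; in the unsupervised case the algorithm recovers the two groups purely from the geometry of the inputs.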
Federated Learning (FL)
• Federated Learning addresses the challenges of privacy, security, and
data decentralization.
Federated Learning-Training Mechanism
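A common training mechanism for FL is federated averaging (FedAvg): the server broadcasts the global model, each client takes local gradient steps on its private data, and the server averages the returned models weighted by dataset size. A minimal sketch of one such round on synthetic linear-regression clients; the model, learning rate, and data are illustrative, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, X, y, lr=0.1):
    """One local gradient step on this client's private data."""
    grad = 2 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def fedavg_round(w_global, clients):
    """Server sends w_global to clients, then averages their updates
    weighted by local dataset size (raw data never leaves the clients)."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(w_global.copy(), X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, float))

# Synthetic clients whose data share the true weights [2.0, -1.0].
true_w = np.array([2.0, -1.0])
clients = []
for n in (20, 50, 30):            # heterogeneous dataset sizes
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(100):
    w = fedavg_round(w, clients)
print(w)  # approaches [2.0, -1.0]
```

Note that only model parameters cross the network; each client's (X, y) stays local, which is the privacy and decentralization point made on the previous slide.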
Issues and Challenges in FL
• Communication Efficiency
• Heterogeneity of Clients
• Non-Independent and Identically Distributed (Non-IID) Data
FL in 5G Networks

Fig: Federated learning
Fig: Hierarchical FL
Fig: Proposed FL architecture in VTC-Spring 2024
