
Probabilistic Reasoning

 Introduction to Probabilistic Reasoning


 Bayes and Markov Networks, DBNs and HMMs

COMPILED BY: ER. SANTOSH PANDEYA 1


What is uncertainty?

 The lack of the exact knowledge that would enable us to reach a perfectly reliable conclusion.
 Classical logic permits only exact reasoning, i.e. it assumes perfect knowledge always exists:

IF A is true THEN A is not false
IF B is true THEN B is not false

In the real world, such perfect knowledge is rarely available.

Causes of uncertainty:
Following are some leading causes of uncertainty in the real world:
1. Information from unreliable sources
2. Experimental errors
3. Equipment faults
4. Temperature variation
5. Climate change



Sources of Uncertain Knowledge

Weak Implication: Domain experts and knowledge engineers face the rather painful or hopeless task of establishing concrete correlations between the IF (condition) and THEN (action) parts of rules.

Imprecise Language: Natural language is ambiguous and imprecise. We define facts in terms of often, sometimes, frequently, hardly ever; such vague terms weaken IF-THEN implications.

Unknown Data: Incomplete and missing data must still be processed, resorting to approximate reasoning with the values available.

Combining the views of different experts: Large systems use data from many experts, whose views must be reconciled.




Probabilistic reasoning:

 Probabilistic reasoning is a way of knowledge representation where we apply the concept of probability to indicate
the uncertainty in knowledge.
 In probabilistic reasoning, we combine probability theory with logic to handle the uncertainty.
 We use probability in probabilistic reasoning because it provides a way to handle the uncertainty that is the result
of someone's laziness and ignorance.
 In the real world, there are lots of scenarios, where the certainty of something is not confirmed, such as "It will rain
today," "behavior of someone for some situations," "A match between two teams or two players." These are
probable sentences for which we can assume that it will happen but not sure about it, so here we use probabilistic
reasoning.

Need of probabilistic reasoning in AI:

 When there are unpredictable outcomes.


 When specifications or possibilities of predicates become too large to handle.
 When an unknown error occurs during an experiment.



 The basic concept of probability plays a significant role in our life: we try to determine the probability of rain, the prospect of promotion, or the likelihood of winning at blackjack.

 The probability of an event is the proportion of cases in which the event occurs (Good, 1959)

 Probability, mathematically, is indexed between 0 and 1

 Most events have probability index strictly between 0 and 1, which means that each event has at least two possible
outcomes: favorable outcome or success and unfavorable outcomes or failure



If s is the number of ways an event can succeed and f is the number of ways it can fail, then:

P(success) = s / (s + f)
P(failure) = f / (s + f)



 Let us consider classical examples with a coin and a die. If we toss a coin, the probability of getting a head will be equal to the probability of getting a tail. In a single toss, s = f = 1, and therefore the probability of getting a head (or a tail) is 0.5.

 Consider now a die and determine the probability of getting a 6 in a single throw. If we take a 6 as the only success, then s = 1 and f = 5, since there is just one way of getting a 6 and five ways of not getting one in a single throw. Therefore, the probability of getting a 6 is p = 1/(1 + 5) = 1/6 ≈ 0.167.

The above instances involve mutually exclusive events, i.e. events which cannot happen simultaneously.




 In the dice experiment, the two events of obtaining a 6 and, for example, a 1 are mutually exclusive because we
cannot obtain a 6 and a 1 simultaneously in a single throw.

 However, when events are dependent, the occurrence of one may affect the likelihood of the other.

 Consider, for instance, the probability of getting a 6 in a single throw, knowing this time that a 1 has not come up. There are still five ways of not getting a 6, but one of them can be eliminated, as we know that a 1 has not been obtained. Thus,

p = 1/(1 + (5 − 1)) = 1/5 = 0.2
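These die calculations can be sketched in a few lines of Python; the `prob` helper below is only an illustration of the s/(s + f) ratio, not something from the slides:

```python
from fractions import Fraction

def prob(s, f):
    # Probability as successes over all equally likely outcomes: p = s / (s + f).
    return Fraction(s, s + f)

# Single die: one way to roll a 6, five ways not to.
p_six = prob(1, 5)                     # 1/6

# Knowing a 1 has not come up eliminates one of the five failure cases.
p_six_given_not_one = prob(1, 5 - 1)   # 1/5

print(p_six, p_six_given_not_one)
```

Using `Fraction` keeps the probabilities exact instead of approximating them with floats.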



Probability can be defined as the chance that an uncertain event will occur. It is the numerical measure of the likelihood that an event will occur. The value of a probability always lies between 0 and 1:

0 ≤ P(A) ≤ 1, where P(A) is the probability of an event A.

P(A) = 0 indicates that event A is impossible.

P(A) = 1 indicates total certainty in event A.

We can find the probability of an uncertain event using the formula:

P(A) = Number of favourable outcomes / Total number of outcomes



Conditional probability:
Conditional probability is the probability of an event occurring given that another event has already happened.

Suppose we want to calculate the probability of event A when event B has already occurred, "the probability of A under the condition B". It can be written as:

P(A|B) = P(A⋀B) / P(B)

Where P(A⋀B) = Joint probability of A and B
P(B) = Marginal probability of B.

If instead the probability of A is given and we need the probability of B, it is given as:

P(B|A) = P(A⋀B) / P(A)


It can be explained using the Venn diagram below: once B has occurred, the sample space is reduced to the set B, and we calculate event A given that B has occurred by dividing P(A⋀B) by P(B).



Example:
In a class, 70% of the students like English and 40% of the students like both English and Mathematics. What percentage of the students who like English also like Mathematics?
Solution:
Let A be the event that a student likes Mathematics and B the event that a student likes English.

P(A|B) = P(A⋀B) / P(B) = 0.4 / 0.7 ≈ 0.57

Hence, 57% of the students who like English also like Mathematics.
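As a quick check, the arithmetic can be reproduced in Python (the variable names are illustrative):

```python
# Figures from the example above.
p_english = 0.70   # P(B): student likes English
p_both = 0.40      # P(A and B): student likes both subjects

# P(A | B) = P(A and B) / P(B)
p_math_given_english = p_both / p_english

print(round(100 * p_math_given_english))  # 57 (percent)
```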



Conditional Probability
 Simple probability deals with independent events.
 If two events A and B are independent, then p(A∩B) = p(A) · p(B).
 If two events are interdependent and the outcome of one affects the outcome of the other, then we need to consider conditional probability.
 The conditional probability P(B|A) indicates the probability of event B given that event A has occurred:

P(B|A) = P(A∩B) / P(A), P(A) ≠ 0

Note: A∩B denotes the event that occurs if and only if both A and B occur simultaneously in the same performance of the experiment under consideration.



Unconditional Probability
 Given a random variable X, P(X = x) denotes the unconditional probability, or prior probability, that X has value x in the absence of any other information.

 So the prior probability, or unconditional probability, is the degree of belief accorded to a proposition in the absence of any other information.



Joint Probability
 A joint probability is the probability of event A and event B happening, P(A and B).
It is the likelihood of the intersection of two or more events.
 The probability of the intersection of A and B is written as P(A ∩ B).

 For example, the likelihood that a card is black and a seven is P(Black and Seven) = 2/52 = 1/26. (There are two black sevens in a deck of 52: the 7 of clubs and the 7 of spades.)

 Conditional probability is one type of probability in which the possibility of an event depends upon the occurrence of a previous event.
 As this type of event is very common in real life, conditional probability is often used to determine the probability of such cases.



 Conditional probability describes the likelihood of an event (A) happening given that another event (B) has already occurred.

 In probability notation, this is denoted as A given B, expressed as P(A|B), indicating that the probability of event A is dependent on the occurrence of event B.

 To understand conditional probability, we need to be familiar with independent events and dependent events. Let's understand conditional probability with an example.



 Conditional Probability is defined as the probability of any event occurring when another event has already occurred.
 In other words, it calculates the probability of one event happening given that a certain condition is satisfied.
 It is represented as P (A | B) which means the probability of A when B has already happened.

For example, let's consider the case of rolling two dice; the sample space of this experiment is as follows:

{(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6),
(2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6),
(3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6),
(4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6),
(5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6),
(6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)}

Now, consider an event A = getting 3 on the first die and B = getting a sum of 9.

Then the probability of getting a sum of 9 when the first die already shows 3 is P(B | A), which can be calculated as follows:
All the cases with 3 on the first die are (3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6).
Of these six equally likely cases, only one, (3, 6), has a sum of 9.
Thus, P(B | A) = 1/6.



In case we have to find P(A | B):

All cases where the sum is 9 are (3, 6), (4, 5), (5, 4), and (6, 3).

Of these four cases, only one, (3, 6), has 3 on the first die.

Thus, P(A | B) = 1/4.
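Both conditional probabilities can be checked by enumerating the 36 equally likely outcomes of two dice; this short Python sketch (written for this note, not from the slides) counts the favourable cases directly:

```python
from itertools import product
from fractions import Fraction

# All 36 equally likely outcomes of rolling two dice.
space = set(product(range(1, 7), repeat=2))

A = {s for s in space if s[0] == 3}          # 3 on the first die
B = {s for s in space if s[0] + s[1] == 9}   # sum of 9

def cond(event, given):
    # P(event | given) = |event ∩ given| / |given| under equally likely outcomes.
    return Fraction(len(event & given), len(given))

print(cond(B, A))  # 1/6
print(cond(A, B))  # 1/4
```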

We can calculate the conditional probability of simple cases without any formula, as we have seen above, but for complex cases we cannot possibly count all the outcomes, so we need a conditional probability equation. For two events A and B, the formula for the conditional probability of A when B has already occurred is given by:

P(A|B) = P (A ∩ B) / P(B)

Where,

P (A ∩ B) represents the probability of both events A and B occurring simultaneously.


P(B) represents the probability of event B occurring.



How to Calculate Conditional Probability?
To calculate the conditional probability, we can use the following step-by-step method:

Step 1: Identify the Events. Let’s call them Event A and Event B.

Step 2: Determine the Probability of Event A i.e., P(A)

Step 3: Determine the Probability of Event B i.e., P(B)

Step 4: Determine the Probability of Event A and B i.e., P(A∩B).

Step 5: Apply the Conditional Probability Formula and calculate the required probability.



Conditional Probability of Independent Events

When two events are independent, their conditional probability is the same as the probability of the event individually, i.e., P(A | B) is the same as P(A), as event B has no effect on the probability of event A. For independent events A and B, the conditional probabilities with respect to each other are given as follows:

P(B|A) = P(B)
P(A|B) = P(A)
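A small sketch can illustrate this with two dice; the events below are chosen purely for illustration and concern different dice, so they are independent:

```python
from itertools import product
from fractions import Fraction

space = set(product(range(1, 7), repeat=2))
A = {s for s in space if s[0] % 2 == 0}   # first die is even
B = {s for s in space if s[1] > 4}        # second die shows 5 or 6

def p(event):
    return Fraction(len(event), len(space))

def cond(event, given):
    return Fraction(len(event & given), len(given))

# Conditioning on an independent event leaves the probability unchanged.
print(cond(A, B) == p(A))  # True: both are 1/2
print(cond(B, A) == p(B))  # True: both are 1/3
```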



What is Joint Probability?
 Joint Probability refers to the likelihood of two or more events happening together or in conjunction
with each other.
 It helps answer questions such as, “What is the probability of both event A and event B occurring in a
context?”

What does Joint Probability tell us?


Joint probability offers valuable insights into the likelihood of multiple events happening together. This helps us in
several ways:

1. Co-occurrence: Joint probability helps us understand how likely it is for two or more events to happen at the
same time. This is important for seeing how events are connected and the probability of them occurring
together.

2. Risk Evaluation: In areas like finance and insurance, joint probability helps us assess the risk when multiple events
overlap. For instance, it can estimate the chance of multiple financial instruments facing losses simultaneously.

3. Quality Check: Businesses can use joint probability to gauge the reliability and quality of their products or
processes. It shows the likelihood of multiple defects or issues occurring at once, which allows for proactive quality
improvement efforts.



4. Event Relationships: Joint probability can indicate if events are related or not. If joint probability significantly differs
from the product of individual probabilities, it suggests events are connected, and the occurrence of one affects the
likelihood of the other.

5. Decision Support: When businesses need to make choices involving multiple factors or events, joint probability
provides a numerical foundation for decision-making. It helps assess how different variables together impact the
desired outcome.

6. Resource Management: In situations with limited resources, understanding joint probability helps optimise
resource allocation. For example, in supply chain management, it can estimate the chance of multiple supply chain
disruptions happening at the same time, enabling better risk management strategies.



Formula for Joint Probability
The formula for calculating joint probability hinges on whether the events are independent or dependent:

1. For Independent Events


When events A and B are independent, meaning that the occurrence of one event does not impact the other, we use
the multiplication rule:
P(A∩B) = P(A) x P(B)
Here, P(A) is the probability of occurrence of event A, P(B) is the probability of occurrence of event B, and P(A∩B) is
the joint probability of events A and B.

2. For Dependent Events


Events are often dependent on each other, meaning that one event’s occurrence influences the likelihood of the
other. Here, we employ a modified formula:
P(A∩B) = P(A) x P(B|A)

Here, P(A) is the probability of occurrence of event A, P(B|A) is the conditional probability of occurrence of event B
when event A has already occurred, and P(A∩B) is the joint probability of events A and B.
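A brief sketch of both formulas; the coin and card numbers below are standard textbook figures, used here only as an illustration:

```python
from fractions import Fraction

# Independent events: two fair coin tosses both landing heads.
p_head = Fraction(1, 2)
p_both_heads = p_head * p_head                   # P(A ∩ B) = P(A) x P(B)

# Dependent events: drawing two aces from a deck without replacement.
p_first_ace = Fraction(4, 52)                    # P(A)
p_second_given_first = Fraction(3, 51)           # P(B|A): one ace already gone
p_two_aces = p_first_ace * p_second_given_first  # P(A ∩ B) = P(A) x P(B|A)

print(p_both_heads, p_two_aces)  # 1/4 1/221
```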



Bayesian Networks
 A Bayesian Network is a data structure which represents the dependencies among variables and gives
a concise specification of any full joint probability distribution.
 Also called Belief Networks or Probabilistic inference network
 It is a powerful knowledge representation and reasoning mechanism.
 It represents events and causal relationship between uncertainty as conditional probabilities
involving random variables.
 Given the values of subset of these variables (evidence variables) Bayesian networks can compute the
probabilities of another subset of variables (Query variables).



 A Bayesian Network is a directed acyclic graph (DAG), where there is a node for each random variable and a directed arc from A to B whenever A is a direct causal influence on B.

 Thus arcs represent direct causal relationships and the nodes represent states of affairs.

 If there is arc from A to B then, A is said to be parent of B.

 Each node Xi has a conditional probability distribution P(Xi | Parents(Xi)) that quantifies the effect of the parents on the node. The set of nodes and links is called the topology of the network.

 In a Bayesian network, the joint probability distribution can be written as the product of the local distributions of each node given its parents:

P(X1, X2, …, Xn) = ∏i P(Xi | Parents(Xi))

 where Parents(Xi) denotes the set of parent nodes of node Xi.



 Example: The graph shown in the following figure represents a Bayesian belief network having four nodes A, B, C, D, with {A, B} representing evidence and {C, D} representing hypotheses. Here A and B are unconditional nodes and C and D are conditional nodes; the arcs describe the dependencies.



P(A) = 0.3        P(B) = 0.7

A  B  | P(D)
T  T  | 0.7
T  F  | 0.3
F  T  | 0.2
F  F  | 0.01

A  | P(C)
T  | 0.4
F  | 0.3



 We know that the joint probability for four variables can be computed as:
P(A,B,C,D) = P(D|A,B,C) * P(C|A,B) * P(B|A) * P(A).
 However, this can be simplified using the Bayesian belief network, as some of these terms will not be required. For example, C does not depend on B, and D does not depend on C.
 Hence P(C|A,B) reduces to P(C|A) and P(D|A,B,C) reduces to P(D|A,B).
 Further, A and B are independent, therefore P(B|A) reduces to P(B).



So, P(A,B,C,D) = P(D|A,B) * P(C|A) * P(B) * P(A)
               = 0.7 * 0.4 * 0.7 * 0.3
               = 0.0588
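The same calculation can be sketched in Python, with the conditional probability tables above stored as dictionaries; this is a minimal illustration for this one network, not a general Bayesian-network library:

```python
# Conditional probability tables from the example network.
P_A = {True: 0.3, False: 0.7}
P_B = {True: 0.7, False: 0.3}
P_C_given_A = {True: 0.4, False: 0.3}                      # P(C=T | A)
P_D_given_AB = {(True, True): 0.7, (True, False): 0.3,
                (False, True): 0.2, (False, False): 0.01}  # P(D=T | A, B)

def joint(a, b, c, d):
    # P(A,B,C,D) = P(A) * P(B) * P(C|A) * P(D|A,B)
    pc = P_C_given_A[a] if c else 1 - P_C_given_A[a]
    pd = P_D_given_AB[(a, b)] if d else 1 - P_D_given_AB[(a, b)]
    return P_A[a] * P_B[b] * pc * pd

print(round(joint(True, True, True, True), 4))  # 0.0588
```

Summing `joint` over all 16 assignments of A, B, C, D gives 1, which is a useful sanity check on the tables.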



Bayesian Networks: Advantages
• Handle situations where some data entries are missing.
• Can be used to predict the consequences of intervention.
• Allow causal relationships to be learned.
• Provide a natural representation for conditional independence.



Markov Model
The random variables have fixed values in the static world.
However, there are many cases, like sequences of spoken words or weather forecasts, where the environment changes over time. These are called dynamic problems.
The HMM is a widely used model for processing sequential data.
 It is used in speech recognition, NLP modelling, online handwriting recognition, and for the analysis of biological sequences such as proteins and DNA.



Markov Assumption

 It states that the current state depends on only a finite history of previous states.
 A process which follows the Markov assumption is called a Markov process or Markov chain.



Markov Model
Consider weather:
 We have three types of weather: sunny, rainy, and foggy.
 Assume the weather lasts all day, i.e., it does not change from rainy to sunny in the middle of the day.
 Weather prediction is about trying to guess what the weather will be like tomorrow based on a history of observations of the weather.
 Assuming we have collected data for today, yesterday, the day before, and so forth, we want:

P(Wn | Wn-1, Wn-2, …, W1)

 Using the above equation, we can give probabilities of types of weather for tomorrow and the next day using n days of history.
 For example, if we know that the weather for the past three days was {Sunny, Sunny, Foggy} in chronological order, the probability that tomorrow would be rainy is given by:
P(W4 = Rainy | W3 = Foggy, W2 = Sunny, W1 = Sunny)

E.g., probabilities of tomorrow's weather based on today's weather (each row sums to 1):

Today \ Tomorrow    Sunny    Rainy    Foggy
Sunny               0.8      0.05     0.15
Rainy               0.2      0.6      0.2
Foggy               0.2      0.3      0.5



Questions:
1. Given that today is Sunny, what’s the probability that tomorrow is sunny and the day after is rainy ?
Solution:
This translates into:
P(w2 = Sunny, w3 = Rainy | w1 = Sunny)
= P(w3 = Rainy | w2 = Sunny) * P(w2 = Sunny | w1 = Sunny)
= (0.05) * (0.8)
= 0.04



2) Given that today is foggy, what’s the probability that it will be rainy two days from now ?
Solution:
There are three ways to get from foggy today to rainy two days from now: {foggy, foggy, rainy}, {foggy, rainy, rainy} and
{foggy, sunny, rainy}. Therefore, we have to sum over these paths:
P(w3 = Rainy | w1 = Foggy) = P(w2 = Foggy, w3 = Rainy | w1 = Foggy) +
                             P(w2 = Rainy, w3 = Rainy | w1 = Foggy) +
                             P(w2 = Sunny, w3 = Rainy | w1 = Foggy)
= P(w3 = Rainy | w2 = Foggy) * P(w2 = Foggy | w1 = Foggy) +
  P(w3 = Rainy | w2 = Rainy) * P(w2 = Rainy | w1 = Foggy) +
  P(w3 = Rainy | w2 = Sunny) * P(w2 = Sunny | w1 = Foggy)
= (0.3) * (0.5) + (0.6) * (0.3) + (0.05) * (0.2)
= 0.34
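Both answers can be reproduced from the transition probabilities the worked examples use; the rainy → sunny/foggy split below is not exercised by the questions and is an assumed value chosen so the row sums to 1:

```python
# P(tomorrow | today); the rainy -> sunny/foggy split is an assumption.
T = {
    'sunny': {'sunny': 0.8, 'rainy': 0.05, 'foggy': 0.15},
    'rainy': {'sunny': 0.2, 'rainy': 0.6,  'foggy': 0.2},
    'foggy': {'sunny': 0.2, 'rainy': 0.3,  'foggy': 0.5},
}

# Q1: P(w2 = Sunny, w3 = Rainy | w1 = Sunny)
q1 = T['sunny']['sunny'] * T['sunny']['rainy']

# Q2: P(w3 = Rainy | w1 = Foggy), summing over the intermediate day w2.
q2 = sum(T['foggy'][w2] * T[w2]['rainy'] for w2 in T)

print(round(q1, 2), round(q2, 2))  # 0.04 0.34
```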



Hidden Markov Model:
 Suppose you are locked in a room for several days and you are asked for the weather outside.
 The only piece of evidence you have is whether the person who comes into the room carrying your
daily meal is carrying an umbrella or not.
 Let’s suppose the following probabilities

Weather    P(umbrella)
Sunny      0.1
Rainy      0.8
Foggy      0.3



Hidden Markov Models (HMM)
Let ui be true if the caretaker brought an umbrella on day i and false otherwise.

If we assume that, for all i, given wi, ui is independent of all other uj and wj, the probability P(u1, …, un | w1, …, wn) can be estimated as:

P(u1, …, un | w1, …, wn) = ∏i P(ui | wi)

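Under that independence assumption, the likelihood is just a product over the days; a minimal Python sketch using the umbrella table above:

```python
# P(umbrella | weather) from the table above.
P_umbrella = {'sunny': 0.1, 'rainy': 0.8, 'foggy': 0.3}

def likelihood(umbrellas, weather):
    # P(u1..un | w1..wn) = product of P(ui | wi), each ui depending only on wi.
    p = 1.0
    for u, w in zip(umbrellas, weather):
        p *= P_umbrella[w] if u else 1 - P_umbrella[w]
    return p

# E.g., seeing an umbrella on two consecutive rainy days:
print(round(likelihood([True, True], ['rainy', 'rainy']), 2))  # 0.64
```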


Hidden Markov Models (HMM)
 Applications
 Speech Recognition
 Text Processing
 Bioinformatics
 Finance

