5 Probabilistic Reasoning
Uncertainty is the lack of the exact knowledge that would enable us to reach a perfectly reliable conclusion.
Classical logic permits only exact reasoning; it assumes that perfect knowledge always exists:
IF A is true THEN A is not false, and
IF B is true THEN B is not false.
In the real world, however, such perfect knowledge is rarely available.
Causes of uncertainty:
The following are some leading causes of uncertainty in the real world:
1. Information obtained from unreliable sources
2. Experimental errors
3. Equipment faults
4. Temperature variation
5. Climate change
Weak implication: domain experts and knowledge engineers face the rather painful, sometimes hopeless, task of establishing concrete correlations between the IF (condition) and THEN (action) parts of rules.
Vague data.
Combining the views of different experts: large systems must combine knowledge obtained from many experts.
The basic concept of probability plays a significant role in our lives: we try to determine the probability of rain, the prospect of promotion, or the likelihood of winning at Blackjack.
The probability of an event is the proportion of cases in which the event occurs (Good, 1959)
Probabilistic reasoning is a way of knowledge representation where we apply the concept of probability to indicate
the uncertainty in knowledge.
In probabilistic reasoning, we combine probability theory with logic to handle the uncertainty.
We use probability in probabilistic reasoning because it provides a way to handle the uncertainty that is the result
of someone's laziness and ignorance.
In the real world, there are many scenarios where the certainty of something is not confirmed, such as "It will rain today," "someone's behaviour in some situation," or "a match between two teams or two players." These are probable statements for which we can assume an outcome but cannot be sure of it, so here we use probabilistic reasoning.
Most events have a probability strictly between 0 and 1, which means that each event has at least two possible outcomes: a favourable outcome, or success, and an unfavourable outcome, or failure. If there are s ways of succeeding and f ways of failing, the probability of success is

p = s / (s + f)

Consider now a die, and determine the probability of getting a 6 from a single throw. If we treat a 6 as the only success, then s = 1 and f = 5, since there is just one way of getting a 6, and there are five ways of not getting a 6 in a single throw. Therefore, the probability of getting a 6 is

p = 1 / (1 + 5) = 1/6
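This counting rule is easy to check in code; a minimal sketch:

```python
# Probability as the ratio of successes to all equally likely outcomes:
# p = s / (s + f)
def probability(successes, failures):
    return successes / (successes + failures)

# One way to throw a 6, five ways not to:
p_six = probability(1, 5)
print(p_six)  # 0.1666... = 1/6
```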
The instances above involve mutually exclusive events, i.e. events that cannot happen simultaneously.
In the dice experiment, the two events of obtaining a 6 and, for example, a 1 are mutually exclusive because we
cannot obtain a 6 and a 1 simultaneously in a single throw.
However, events that are not independent may affect the likelihood of one or the other occurring.
Consider, for instance, the probability of getting a 6 in a single throw, knowing this time that a 1 has not come up.
There are still five ways of not getting a 6, but one of them can be eliminated as we know that a 1 has not been
obtained. Thus,
p = 1 / (1 + (5 − 1)) = 1/5
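The same result can be verified by counting the remaining outcomes directly (a small sketch):

```python
# Knowing that a 1 has not come up removes one of the five failure
# cases, raising the probability of a 6 from 1/6 to 1/5.
outcomes = {1, 2, 3, 4, 5, 6}
remaining = outcomes - {1}  # condition: the throw was not a 1
p_six_given_not_one = sum(1 for o in remaining if o == 6) / len(remaining)
print(p_six_given_not_one)  # 0.2 = 1/5
```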
We can find the probability of an uncertain event by using the following formula:

P(Event) = Number of desired outcomes / Total number of outcomes

Suppose we want to calculate the probability of event A when event B has already occurred, "the probability of A under the condition B". It can be written as:

P(A|B) = P(A ∩ B) / P(B)

For example, if 70% of the students in a class like English and 40% like both English and mathematics, the probability that a student who likes English also likes mathematics is P(Mathematics | English) = 0.4 / 0.7 ≈ 0.57.
Hence, 57% of the students who like English also like Mathematics.
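The 57% figure can be checked numerically, assuming the survey percentages 70% (like English) and 40% (like both) used in the worked example:

```python
# Assumed survey data: P(English) = 0.7, P(English and Mathematics) = 0.4
p_english = 0.7
p_english_and_math = 0.4

# P(Mathematics | English) = P(English and Mathematics) / P(English)
p_math_given_english = p_english_and_math / p_english
print(f"{p_math_given_english:.0%}")  # 57%
```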
For example, the likelihood that a card is black and a seven is P(Black and Seven) = 2/52 = 1/26. (There are two black sevens in a deck of 52: the 7 of clubs and the 7 of spades.)
For example, consider the case of rolling two dice; the sample space of this event is as follows:
{(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6),
(2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6),
(3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6),
(4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6),
(5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6),
(6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)}
Now, consider event A = getting a 3 on the first die and event B = getting a sum of 9.
The probability of a sum of 9 when the first die already shows 3 is P(B | A), which can be calculated as follows:
All the cases with 3 on the first die are (3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6).
Of these six cases, only one has a sum of 9.
Thus, P(B | A) = 1/6.
Similarly, all cases where the sum is 9 are (3, 6), (4, 5), (5, 4), and (6, 3).
Of these four cases, only one has a 3 on the first die, i.e. (3, 6), so P(A | B) = 1/4.
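The counting argument above can be reproduced by enumerating the 36 outcomes; a minimal sketch:

```python
from itertools import product

# All 36 equally likely outcomes of rolling two dice.
space = list(product(range(1, 7), repeat=2))

A = [o for o in space if o[0] == 3]    # first die shows 3
B = [o for o in space if sum(o) == 9]  # sum is 9
both = [o for o in A if o in B]        # both conditions hold

p_B_given_A = len(both) / len(A)  # 1/6
p_A_given_B = len(both) / len(B)  # 1/4
print(p_B_given_A, p_A_given_B)
```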
We can calculate the conditional probability of simple cases without any formula, as in the example above, but for complex cases we need the conditional probability equation, since we cannot possibly count all the cases. For two events A and B, the conditional probability of A when B has already occurred is given by:
P(A|B) = P(A ∩ B) / P(B)
where P(A ∩ B) is the joint probability of A and B, and P(B) is the probability of B (with P(B) > 0).
Step 1: Identify the events. Let's call them event A and event B.
Step 2: Determine P(B), the probability of the event that is known to have occurred.
Step 3: Determine P(A ∩ B), the probability that both events occur together.
Step 4: Check that P(B) > 0, so that the conditional probability is defined.
Step 5: Apply the conditional probability formula and calculate the required probability.
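The formula can be sketched as a small helper, assuming the component probabilities are already known (the function name is illustrative):

```python
def conditional_probability(p_a_and_b, p_b):
    """P(A|B) = P(A ∩ B) / P(B); undefined when P(B) is zero."""
    if p_b <= 0:
        raise ValueError("P(B) must be positive")
    return p_a_and_b / p_b

# Dice example from above: P(sum is 9 | first die shows 3)
print(conditional_probability(1/36, 6/36))  # 0.1666... = 1/6
```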
When two events are independent, their conditional probability is the same as the probability of each event individually, i.e. P(A | B) is the same as P(A), since event B has no effect on the probability of event A. For independent events A and B, the conditional probabilities with respect to each other are given as follows:
P(B|A) = P(B)
P(A|B) = P(A)
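Independence can be verified by counting, e.g. with two separate dice (a sketch):

```python
from itertools import product

# Outcomes of two independent dice throws.
space = list(product(range(1, 7), repeat=2))
A = {o for o in space if o[0] == 6}  # first die is 6
B = {o for o in space if o[1] == 6}  # second die is 6

p_A = len(A) / len(space)          # 1/6
p_A_given_B = len(A & B) / len(B)  # still 1/6: B tells us nothing about A
print(p_A == p_A_given_B)  # True
```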
1. Co-occurrence: Joint probability helps us understand how likely it is for two or more events to happen at the
same time. This is important for seeing how events are connected and the probability of them occurring
together.
2. Risk Evaluation: In areas like finance and insurance, joint probability helps us assess the risk when multiple events
overlap. For instance, it can estimate the chance of multiple financial instruments facing losses simultaneously.
3. Quality Check: Businesses can use joint probability to gauge the reliability and quality of their products or
processes. It shows the likelihood of multiple defects or issues occurring at once, which allows for proactive quality
improvement efforts.
4. Decision support: When businesses need to make choices involving multiple factors or events, joint probability provides a numerical foundation for decision-making. It helps assess how different variables together impact the desired outcome.
5. Resource management: In situations with limited resources, understanding joint probability helps optimise resource allocation. For example, in supply chain management, it can estimate the chance of multiple supply chain disruptions happening at the same time, enabling better risk management strategies.
The joint probability of two events is given by the product rule:

P(A ∩ B) = P(A) × P(B|A)

Here, P(A) is the probability of occurrence of event A, P(B|A) is the conditional probability of occurrence of event B when event A has already occurred, and P(A ∩ B) is the joint probability of events A and B.
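The product rule can be checked against the earlier card example (P(Black and Seven) = 1/26):

```python
p_black = 26 / 52             # P(A): the card is black
p_seven_given_black = 2 / 26  # P(B|A): the card is a seven, given it is black

# Product rule: P(A ∩ B) = P(A) * P(B|A)
p_black_and_seven = p_black * p_seven_given_black
print(p_black_and_seven)  # 0.03846... = 1/26
```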
Thus the arcs represent direct causal relationships, and the nodes represent states of affairs.
Each node Xi has a conditional probability distribution P(Xi | Parents(Xi)) that quantifies the effect of the parents on the node. The set of nodes and links is called the topology of the network.
In a Bayesian network, the joint probability distribution can be written as the product of the local distributions of each node given its parents:

P(X1, X2, ..., Xn) = ∏ P(Xi | Parents(Xi))

where Parents(Xi) denotes the set of parent nodes of node Xi.
For example, a node C with a single parent A has a conditional probability table such as:

A    P(C|A)
T    0.4
F    0.3
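A sketch of how the local distributions combine; the table values above are used for P(C | A), and the prior P(A = T) = 0.5 is assumed for illustration:

```python
# Two-node network A -> C.
p_a = {True: 0.5, False: 0.5}               # assumed prior for A
p_c_true_given_a = {True: 0.4, False: 0.3}  # P(C=T | A) from the table above

def joint(a, c):
    """P(A=a, C=c) = P(A=a) * P(C=c | A=a)"""
    p_c = p_c_true_given_a[a] if c else 1 - p_c_true_given_a[a]
    return p_a[a] * p_c

# The four joint probabilities sum to 1, as required.
total = sum(joint(a, c) for a in (True, False) for c in (True, False))
print(total)
```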
The Markov assumption states that the current state depends only on a finite history of previous states; for a first-order Markov process,

P(Wn | Wn−1, Wn−2, ..., W1) = P(Wn | Wn−1)

A process which follows the Markov assumption is called a Markov process or Markov chain.
Using the above equation, we can give the probabilities of types of weather for tomorrow and the next day using n days of history.
Suppose we know that the weather for the past three days was {Sunny, Sunny, Foggy} in chronological order; the probability that tomorrow would be rainy is given by:
P(W4 = Rainy | W3 = Foggy, W2 = Sunny, W1 = Sunny)
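Under a first-order Markov assumption this reduces to P(W4 = Rainy | W3 = Foggy), which a transition table answers directly; the transition probabilities below are assumed for illustration:

```python
# Assumed one-step weather transition probabilities P(tomorrow | today).
transition = {
    "Sunny": {"Sunny": 0.8, "Foggy": 0.15, "Rainy": 0.05},
    "Foggy": {"Sunny": 0.2, "Foggy": 0.5,  "Rainy": 0.3},
    "Rainy": {"Sunny": 0.2, "Foggy": 0.2,  "Rainy": 0.6},
}

# P(W4 = Rainy | W3 = Foggy, W2 = Sunny, W1 = Sunny)
#   = P(W4 = Rainy | W3 = Foggy) under the Markov assumption.
p_rainy_tomorrow = transition["Foggy"]["Rainy"]
print(p_rainy_tomorrow)  # 0.3
```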