Operations Research Lesson 3
under Uncertainty
Probability, Random Variables, Expectation, and Distribution
▶ Probability
▶ Random Variables
▶ Expectation
▶ Probability Distribution
Probability [1]
Probability [2]
Probability [3]
Probability [4]
Probability [5]
Probability [6]
Probability [7]
Law-1: If A and B are mutually exclusive, then
we can write
P (A or B) = P (A ∪ B) = P (A) + P (B)
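As a quick numerical check of Law-1 (this example is not from the slides), consider one roll of a fair die with A = {1, 2} and B = {3}, which cannot occur together:

```python
from fractions import Fraction

# One roll of a fair die: each outcome has probability 1/6.
P = {o: Fraction(1, 6) for o in range(1, 7)}

A = {1, 2}  # event A
B = {3}     # event B, mutually exclusive with A

p_union = sum(P[o] for o in A | B)                    # P(A or B) directly
p_sum = sum(P[o] for o in A) + sum(P[o] for o in B)   # P(A) + P(B), Law-1

assert p_union == p_sum == Fraction(1, 2)             # 2/6 + 1/6 = 3/6
```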
Probability [10]
Probability [11]
Probability [12]
Probability [13]
Random Variable
Definition: A random variable assigns a real number to every possible outcome or event of an experiment. It may be either discrete or continuous.
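For instance (an illustration, not taken from the slides), tossing two fair coins and letting X count the number of heads assigns a real number to every outcome. A minimal sketch in Python:

```python
# X = number of heads in two fair coin tosses: a discrete random
# variable mapping every outcome of the experiment to a real number.
outcomes = [("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")]
X = {omega: sum(coin == "H" for coin in omega) for omega in outcomes}
# X == {('H','H'): 2, ('H','T'): 1, ('T','H'): 1, ('T','T'): 0}
```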
Probability Distribution [1]
When we put the values of a random variable and their corresponding probabilities into a table, the table is called the probability distribution of that random variable.
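Continuing the two-coin sketch above, the distribution of X can be tabulated in code (illustrative, not from the slides):

```python
from collections import Counter
from fractions import Fraction

# X = number of heads in two fair coin tosses.
outcomes = [("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")]
X = {omega: sum(coin == "H" for coin in omega) for omega in outcomes}

# Tabulate P(X = x): the four outcomes are equally likely.
counts = Counter(X.values())
pmf = {x: Fraction(c, len(X)) for x, c in sorted(counts.items())}
# P(X=0) = 1/4, P(X=1) = 1/2, P(X=2) = 1/4
assert sum(pmf.values()) == 1   # probabilities must sum to 1
```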
Expected Value and Variance
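For a discrete random variable X, the standard definitions are E(X) = Σ x·P(x) and Var(X) = E(X²) − [E(X)]². A minimal sketch, reusing the two-coin distribution above (the example data is my own illustration, not from the slides):

```python
from fractions import Fraction

# pmf of X = number of heads in two fair coin tosses (from above).
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

E = sum(x * p for x, p in pmf.items())       # E(X)  = sum of x * P(x)
E2 = sum(x * x * p for x, p in pmf.items())  # E(X^2)
Var = E2 - E * E                             # Var(X) = E(X^2) - E(X)^2

assert E == 1 and Var == Fraction(1, 2)
```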
Joint Probability Distribution
When we put the values of a joint random variable and their corresponding probabilities into a table, we get the joint probability distribution.
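A minimal sketch of such a table in code, with made-up probabilities (not from the slides):

```python
from fractions import Fraction

# Hypothetical joint distribution P(X = x, Y = y) of two discrete
# random variables; the numbers are illustrative only.
joint = {
    (0, 0): Fraction(1, 10), (0, 1): Fraction(2, 10),
    (1, 0): Fraction(3, 10), (1, 1): Fraction(4, 10),
}
assert sum(joint.values()) == 1  # a valid joint distribution sums to 1
```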
Joint Probability Distribution Function
Marginal Probability Distribution Function
1. P(x_i) = Σ_{j=1}^{n} P(x_i, y_j)
2. P(y_j) = Σ_{i=1}^{m} P(x_i, y_j)
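Computing both marginals from the illustrative joint table above (again a sketch, not from the slides):

```python
from collections import defaultdict
from fractions import Fraction

joint = {  # same illustrative joint table as above
    (0, 0): Fraction(1, 10), (0, 1): Fraction(2, 10),
    (1, 0): Fraction(3, 10), (1, 1): Fraction(4, 10),
}

# Marginals: sum the joint probabilities over the other variable.
P_x, P_y = defaultdict(Fraction), defaultdict(Fraction)
for (x, y), p in joint.items():
    P_x[x] += p  # P(x_i) = sum over j of P(x_i, y_j)
    P_y[y] += p  # P(y_j) = sum over i of P(x_i, y_j)

assert P_x == {0: Fraction(3, 10), 1: Fraction(7, 10)}
assert P_y == {0: Fraction(4, 10), 1: Fraction(6, 10)}
```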
Conditional Probability Distribution Function
1. P(x_i/y_j) = P(x_i, y_j) / P(y_j) = P(x_i, y_j) / Σ_{i=1}^{m} P(x_i, y_j)
2. P(y_j/x_i) = P(x_i, y_j) / P(x_i) = P(x_i, y_j) / Σ_{j=1}^{n} P(x_i, y_j)
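A sketch of the first formula applied to the same illustrative table (my own example data):

```python
from fractions import Fraction

joint = {  # same illustrative joint table as above
    (0, 0): Fraction(1, 10), (0, 1): Fraction(2, 10),
    (1, 0): Fraction(3, 10), (1, 1): Fraction(4, 10),
}

def p_x_given_y(x, y):
    # P(x/y) = P(x, y) / P(y), where P(y) = sum over i of P(x_i, y)
    p_y = sum(p for (xi, yj), p in joint.items() if yj == y)
    return joint[(x, y)] / p_y

assert p_x_given_y(0, 1) == Fraction(1, 3)  # (2/10) / (6/10)
```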
Example-1 [1]
Example-1 [2]
Now, P(W ∩ L) = 4/10; P(Y ∩ L) = 3/10
P(W ∩ N) = 2/10; P(Y ∩ N) = 1/10
P(W) = P(W ∩ L) + P(W ∩ N) = 6/10
P(Y) = P(Y ∩ L) + P(Y ∩ N) = 4/10
P(L) = P(W ∩ L) + P(Y ∩ L) = 7/10
P(N) = P(W ∩ N) + P(Y ∩ N) = 3/10
P(L/Y) = P(Y ∩ L) / P(Y) = (3/10) / (4/10) = 0.75
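These numbers can be verified directly in code (the event names W, Y, L, N and the joint values are taken from the slide):

```python
from fractions import Fraction

# Joint probabilities from Example-1.
joint = {
    ("W", "L"): Fraction(4, 10), ("Y", "L"): Fraction(3, 10),
    ("W", "N"): Fraction(2, 10), ("Y", "N"): Fraction(1, 10),
}

P_Y = joint[("Y", "L")] + joint[("Y", "N")]  # P(Y) = 4/10
P_L_given_Y = joint[("Y", "L")] / P_Y        # P(L/Y) = (3/10) / (4/10)
assert P_L_given_Y == Fraction(3, 4)         # = 0.75
```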
Probability Revision and Bayes’ Theorem [1]
Bayes’ theorem is used to incorporate additional information, when it is available, and to generate revised (posterior) probabilities. This means that we can take new or recent data and then revise our old probability estimates (prior probabilities).
Probability Revision and Bayes’ Theorem [2]
Let the prior probabilities be P(A) and P(A′), and let the new information set be ϕ = {P(B/A), P(B/A′)}. Then the posterior probability is:

P(A/B) = P(A ∩ B) / P(B)
       = P(B/A)·P(A) / [P(B ∩ A) + P(B ∩ A′)]
       = P(B/A)·P(A) / [P(B/A)·P(A) + P(B/A′)·P(A′)]
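A minimal numerical sketch of this revision (the prior and likelihood values below are invented for illustration):

```python
from fractions import Fraction

# Hypothetical prior and likelihoods, purely illustrative.
P_A = Fraction(3, 10)           # prior P(A); P(A') = 1 - P(A)
P_B_given_A = Fraction(8, 10)   # P(B/A)
P_B_given_Ac = Fraction(2, 10)  # P(B/A')

# Bayes: P(A/B) = P(B/A)P(A) / [P(B/A)P(A) + P(B/A')P(A')]
num = P_B_given_A * P_A
den = num + P_B_given_Ac * (1 - P_A)
posterior = num / den
print(posterior)  # 12/19, about 0.63: the prior 0.3 revised upward by B
```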
Probability Revision and Bayes’ Theorem [3]
Example-2 [1]
Example-2 [2]
Example-2 [3]