
Preliminaries for Decision Making under Uncertainty

Md. Thasinul Abedin


MBA (Accounting); MSc (Economics and Finance)
Assistant Professor of Accounting and Finance
University of Chittagong

January 20, 2025

Probability, Random Variables, Expectation, and Distribution

▶ Probability
▶ Random Variables
▶ Expectation
▶ Probability Distribution

Probability [1]

A probability is a numerical statement about the chance that an event will occur.
Examples:
1. There is a 50% chance of rain today (P(Rain) = 0.50).
2. There is a 60% chance that he will fail the exam (P(Fail in Exam) = 0.60).

Probability [2]

Two types of probability:
1. Objective Probability
2. Subjective Probability

Probability [3]

Objective Probability: We can compute it logically by looking into past history. It is also known as classical or logical probability. We use the following formula to compute objective probability:
P(Φ) = n/N; Here, Φ = name of an event; n = number of occurrences of the event; N = total number of trials or outcomes. For example, the probability of tossing a fair coin once and getting a head is P(Head) = 1/2.
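As a quick illustration, here is a minimal Python sketch (hypothetical, not from the slides) that estimates an objective probability by relative frequency, counting heads in N simulated coin tosses:

```python
import random

def estimate_p_head(trials: int = 100_000) -> float:
    """Estimate P(Head) for a fair coin as n/N by simulation."""
    heads = sum(1 for _ in range(trials) if random.random() < 0.5)  # n
    return heads / trials                                           # n / N

print(estimate_p_head())  # approximately 0.5
```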

Probability [4]

Subjective Probability: When logic and past history are not appropriate, probability values can be assessed subjectively. The accuracy of subjective probabilities depends on the experience and judgement of the person making the estimates. For example, what is the probability that the price of gasoline will be more than $4 in the next few years? What is the probability that our economy will be in a severe depression in 2035?

Probability [5]

Mutually Exclusive Events: Events are mutually exclusive if the occurrence of one automatically excludes the occurrence of the others!
Collectively Exhaustive Events: Events are collectively exhaustive if the list of events includes every possible outcome.
Note: Many common experiences involve events that are both mutually exclusive and collectively exhaustive!

Probability [6]

Figure: Mutually Exclusive, Collectively Exhaustive, and Both

Probability [7]
Law-1: If A and B are mutually exclusive, then
we can write
P(A or B) = P(A ∪ B) = P(A) + P(B)

Figure: A and B are Mutually Exclusive


Probability [8]
Law-2: If A and B are not mutually exclusive,
then we can write
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Here, P(A ∩ B) = P(A and B)

Figure: A and B are Mutually Inclusive
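A small numeric check of Law-2 (a hypothetical example, assuming draws from a standard 52-card deck with A = "heart" and B = "face card"):

```python
# Addition rule: P(A or B) = P(A) + P(B) - P(A and B)
p_heart = 13 / 52   # P(A): 13 hearts in the deck
p_face = 12 / 52    # P(B): 12 face cards (J, Q, K of each suit)
p_both = 3 / 52     # P(A and B): J, Q, K of hearts

print(p_heart + p_face - p_both)  # 22/52 ≈ 0.4231
```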


Probability [9]

Concepts of Joint, Marginal, and Conditional Probability: These concepts arise when two or more events occur together (simultaneously). To understand these concepts well, we must know about the nature of independent and dependent events!

Probability [10]

Independent and Dependent Events: Two events are independent when the occurrence of one event has no effect on the probability of occurrence of the second event; otherwise, they are dependent.

Figure: Dependent Events

Probability [11]

Figure: Independent Events

Probability [12]

Law-3: If A and B are independent, then
(a) Joint Probability: P(A ∩ B) = P(A) · P(B)
(b) Conditional Probability: P(A/B) = P(A ∩ B)/P(B) = P(A) and P(B/A) = P(A ∩ B)/P(A) = P(B)
(c) Marginal Probability: P(A) = P(A/B) and P(B) = P(B/A)

Probability [13]

Law-4: If A and B are dependent, then
(a) Joint Probability: P(A ∩ B) = P(A/B) · P(B) = P(B/A) · P(A)
(b) Conditional Probability: P(A/B) = P(A ∩ B)/P(B) and P(B/A) = P(A ∩ B)/P(A)
(c) Marginal Probability: P(A) = P(A ∩ B)/P(B/A) and P(B) = P(A ∩ B)/P(A/B)
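To make Laws 3 and 4 concrete, here is a minimal Python sketch (an added illustration, not part of the slides) that enumerates the sample space of two fair-die rolls and checks the joint and conditional formulas for an independent pair of events:

```python
from itertools import product

# Sample space: the 36 equally likely outcomes of rolling two fair dice
space = list(product(range(1, 7), repeat=2))

def prob(event):
    """P(event) = number of favourable outcomes / total outcomes."""
    return sum(1 for o in space if event(o)) / len(space)

def A(o): return o[0] == 6       # event A: first die shows a 6
def B(o): return o[1] % 2 == 0   # event B: second die is even

# Law-3 (independent events): P(A ∩ B) = P(A) · P(B)
p_ab = prob(lambda o: A(o) and B(o))
print(p_ab, prob(A) * prob(B))   # 0.0833... and 0.0833...

# Conditional probability: P(A/B) = P(A ∩ B)/P(B), which reduces to P(A)
print(p_ab / prob(B), prob(A))   # 0.1666... and 0.1666...
```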

Random Variable
Definition: A random variable assigns a real
number to every possible outcome or event in an
experiment. It might be either discrete or
continuous.

Probability Distribution [1]
Once we tabulate the values of a random variable together with their corresponding probabilities, the table is called the probability distribution of that random variable.

Figure: Probability Distribution of Random Variable x


Probability Distribution [2]
Once we plot the probabilities of a random variable against its values, we get a shape. This shape is also known as the probability distribution.

Figure: Uniform Probability Distribution


Probability Distribution Function

A discrete random variable x is said to have a probability distribution function iff:
1. P(x_i) ≥ 0; ∀ i = 1, 2, · · · , n
2. Σ_{i=1}^{n} P(x_i) = 1
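A small Python helper (a hypothetical sketch, not from the slides) can check both conditions for a tabulated discrete distribution:

```python
import math

def is_valid_pdf(probs: list[float]) -> bool:
    """Check the two conditions: every P(x_i) >= 0 and Σ P(x_i) = 1."""
    return all(p >= 0 for p in probs) and math.isclose(sum(probs), 1.0)

print(is_valid_pdf([0.2, 0.3, 0.5]))   # True
print(is_valid_pdf([0.5, 0.6, -0.1]))  # False: negative probability
```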

Expected Value and Variance

(1) The expected value of a random variable x can be written as E(x), where
E(x) = Σ_{i=1}^{n} x_i P(x_i)

(2) The variance of a random variable x can be written as Var(x), where
Var(x) = Σ_{i=1}^{n} (x_i − E(x))² P(x_i)

Note: Standard Deviation, σ = √Var(x)
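The following sketch (an added example with assumed values, not from the slides) computes E(x), Var(x), and σ for a small discrete distribution:

```python
import math

xs = [0, 1, 2, 3]          # values of the random variable x (assumed)
ps = [0.1, 0.2, 0.4, 0.3]  # corresponding probabilities P(x_i)

mean = sum(x * p for x, p in zip(xs, ps))               # E(x) = Σ x_i P(x_i)
var = sum((x - mean) ** 2 * p for x, p in zip(xs, ps))  # Var(x)
sigma = math.sqrt(var)                                  # σ = √Var(x)

print(mean, var, sigma)  # 1.9 0.89 0.943...
```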

Joint Probability Distribution
Once we put the values of a joint random variable together with its corresponding probabilities into a table, we get a joint probability distribution.

Figure: Probability Distribution of a Joint Random Variable (x, y)

Joint Probability Distribution Function

A joint random variable (x, y) is said to have a probability distribution function iff:
1. P(x_i, y_j) ≥ 0; ∀ i = 1, 2, · · · , m and j = 1, 2, · · · , n
2. Σ_{i=1}^{m} Σ_{j=1}^{n} P(x_i, y_j) = 1

Marginal Probability Distribution Function

1. P(x_i) = Σ_{j=1}^{n} P(x_i, y_j)
2. P(y_j) = Σ_{i=1}^{m} P(x_i, y_j)

Conditional Probability Distribution Function

1. P(x_i/y_j) = P(x_i, y_j)/P(y_j) = P(x_i, y_j)/Σ_{i=1}^{m} P(x_i, y_j)
2. P(y_j/x_i) = P(x_i, y_j)/P(x_i) = P(x_i, y_j)/Σ_{j=1}^{n} P(x_i, y_j)

Note: If x and y are independent, then P(x_i, y_j) = P(x_i) · P(y_j).
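Putting the joint, marginal, and conditional formulas together, here is a minimal sketch (with an assumed 2×2 joint table, not taken from the slides) that computes marginals and a conditional distribution and tests independence:

```python
# Assumed joint distribution P(x_i, y_j): rows index x, columns index y
joint = [[0.10, 0.20],
         [0.30, 0.40]]

# Marginals: P(x_i) = Σ_j P(x_i, y_j) and P(y_j) = Σ_i P(x_i, y_j)
p_x = [sum(row) for row in joint]        # [0.30, 0.70]
p_y = [sum(col) for col in zip(*joint)]  # [0.40, 0.60]

# Conditional: P(x_i / y_0) = P(x_i, y_0) / P(y_0)
p_x_given_y0 = [joint[i][0] / p_y[0] for i in range(2)]  # [0.25, 0.75]

# Independence: P(x_i, y_j) = P(x_i) · P(y_j) must hold in every cell
independent = all(abs(joint[i][j] - p_x[i] * p_y[j]) < 1e-9
                  for i in range(2) for j in range(2))
print(p_x, p_y, p_x_given_y0, independent)  # ... False
```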

Example-1 [1]

Assume that we have an urn containing 10 balls of the following descriptions:
4 are white (W) and lettered (L)
2 are white (W) and numbered (N)
3 are yellow (Y) and lettered (L)
1 is yellow (Y) and numbered (N)
Assume I draw a ball.

Example-1 [2]

Now, P(W ∩ L) = 4/10; P(Y ∩ L) = 3/10
P(W ∩ N) = 2/10; P(Y ∩ N) = 1/10
P(W) = P(W ∩ L) + P(W ∩ N) = 6/10
P(Y) = P(Y ∩ L) + P(Y ∩ N) = 4/10
P(L) = P(W ∩ L) + P(Y ∩ L) = 7/10
P(N) = P(W ∩ N) + P(Y ∩ N) = 3/10
P(L/Y) = P(Y ∩ L)/P(Y) = (3/10)/(4/10) = 0.75
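A quick Python verification of Example-1 (an added sketch, not part of the original): enumerate the 10 balls and recompute the conditional probability.

```python
from fractions import Fraction

# The urn: (colour, marking) for each of the 10 balls
balls = ([("W", "L")] * 4 + [("W", "N")] * 2
         + [("Y", "L")] * 3 + [("Y", "N")] * 1)

def p(pred):
    """Probability that a uniformly drawn ball satisfies pred."""
    return Fraction(sum(1 for b in balls if pred(b)), len(balls))

p_Y = p(lambda b: b[0] == "Y")       # P(Y) = 4/10
p_YL = p(lambda b: b == ("Y", "L"))  # P(Y ∩ L) = 3/10
print(p_YL / p_Y)                    # P(L/Y) = 3/4
```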

Probability Revision and Bayes’ Theorem [1]
Bayes’ theorem is used to incorporate additional information as it becomes available, helping to generate revised (posterior) probabilities. This means that we can take new or recent data and revise our old probability estimates (prior probabilities).

Probability Revision and Bayes’ Theorem [2]
Let the prior probabilities be P(A) and P(A′), and let the new information set be ϕ = {P(B/A), P(B/A′)}. Therefore, the posterior probabilities are:

P(A/B) = P(A ∩ B)/P(B) = P(B/A) · P(A)/[P(B ∩ A) + P(B ∩ A′)]
       = P(B/A) · P(A)/[P(B/A) · P(A) + P(B/A′) · P(A′)]

Probability Revision and Bayes’ Theorem [3]

P(A′/B) = P(A′ ∩ B)/P(B) = P(B/A′) · P(A′)/[P(B ∩ A) + P(B ∩ A′)]
        = P(B/A′) · P(A′)/[P(B/A) · P(A) + P(B/A′) · P(A′)]
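The two formulas translate directly into a small Python helper (a sketch added here; the function name bayes_revise is hypothetical):

```python
def bayes_revise(p_a: float, p_b_given_a: float, p_b_given_not_a: float) -> float:
    """Posterior P(A/B) from prior P(A) and likelihoods P(B/A), P(B/A')."""
    p_not_a = 1.0 - p_a                                  # P(A') = 1 - P(A)
    p_b = p_b_given_a * p_a + p_b_given_not_a * p_not_a  # P(B), total probability
    return p_b_given_a * p_a / p_b                       # Bayes' theorem
```

The complementary posterior then follows as P(A′/B) = 1 − P(A/B).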

Example-2 [1]

A cup contains two dice identical in appearance. One, however, is fair (unbiased) and the other is loaded (biased). The probability of rolling a 3 on the fair die is 0.166. The probability of tossing the same number on the loaded die is 0.60. Since we randomly select the die to roll, the probability that it is fair (or loaded) is 0.50.
Required: Find the revised probabilities.

Example-2 [2]

Prior probabilities are P(F) = P(L) = 0.5
New information set: ϕ = {P(3/F), P(3/L)} = {0.166, 0.6}
Now,
P(3) = P(3 ∩ F) + P(3 ∩ L) = P(3/F) · P(F) + P(3/L) · P(L) = 0.166 × 0.5 + 0.6 × 0.5 = 0.383

Example-2 [3]

Posterior (revised) probabilities:

P(F/3) = P(F ∩ 3)/P(3) = P(3/F) · P(F)/P(3) = (0.166 × 0.5)/0.383 = 0.22
P(L/3) = P(L ∩ 3)/P(3) = P(3/L) · P(L)/P(3) = (0.6 × 0.5)/0.383 = 0.78
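Using the hypothetical bayes_revise helper sketched earlier, Example-2's revision reproduces the slide's arithmetic:

```python
p_fair_given_3 = bayes_revise(p_a=0.5, p_b_given_a=0.166, p_b_given_not_a=0.6)
print(round(p_fair_given_3, 2))  # 0.22, so P(L/3) = 1 - 0.22 = 0.78
```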

