
Analysis of Production Systems (4DC10)

Week 1 (Probability models)

Michel Reniers

Department of Mechanical Engineering


Probability models

Relevant in many areas, such as:

• Manufacturing
• Computer networks (Internet)
• Supply chains
• Statistical physics
• Quantum physics
• Random heterogeneous materials
• Biology
• Computer simulation

Learning goal: Be able to formulate and analyze basic probability models


2
Magician

A magician takes a random card from a deck of 52 cards. You have to guess which card
he is holding.

Before guessing you may ask one of the following two questions:

1. Is the card a black one (spades or clubs)?


2. Is the card jack of hearts?

Question: Which question will maximize the probability of guessing the right card?

A Question 1
B Question 2
C Does not matter
3
Magician (Answer)
• "Is the card a black one?"

With probability 1/2 the answer is "yes". Then 26 cards remain to choose from, so
your probability of guessing the right one is 1/26. A similar reasoning applies for
answer "no".

The probability of guessing the right one is 1/2 · 1/26 + 1/2 · 1/26 = 1/26

4
Magician (Answer)

• "Is the card jack of hearts?"

With probability 1/52 the answer is "yes" and you are done.

With probability 51/52 the answer is "no", and then the probability of guessing the right
one is 1/51.

So the probability of guessing the right one is 1/52 · 1 + 51/52 · 1/51 = 1/26
4
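Both computations can be checked with a short simulation (a Python sketch, not part of the original slides; the card encoding is my own):

```python
import random
from itertools import product

random.seed(1)
N = 100_000

# A 52-card deck as (rank, suit) pairs; rank 11 stands for jack.
ranks = list(range(1, 14))
suits = ["spades", "clubs", "hearts", "diamonds"]
deck = list(product(ranks, suits))
JACK_OF_HEARTS = (11, "hearts")

def guess_after_colour_question():
    card = random.choice(deck)
    answer_black = card[1] in ("spades", "clubs")
    # Guess uniformly among the 26 cards consistent with the answer.
    candidates = [c for c in deck if (c[1] in ("spades", "clubs")) == answer_black]
    return random.choice(candidates) == card

def guess_after_jack_question():
    card = random.choice(deck)
    if card == JACK_OF_HEARTS:
        return True  # answer "yes": the card is known
    # Answer "no": guess uniformly among the remaining 51 cards.
    return random.choice([c for c in deck if c != JACK_OF_HEARTS]) == card

p1 = sum(guess_after_colour_question() for _ in range(N)) / N
p2 = sum(guess_after_jack_question() for _ in range(N)) / N
print(p1, p2)  # both close to 1/26 ≈ 0.0385
```

As the analysis predicts, both estimates settle near 1/26, so the choice of question indeed does not matter.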
Monty Hall dilemma

• It is the climax of a game-show.


• You have to choose a door out of three, behind one of them is the car of your
dreams and behind the others a goat.
• You choose a door without opening it. The host (knowing what is behind the
doors) then opens one of the remaining doors, showing a goat.

Question: Now you are given the opportunity to switch doors: Are you going to switch?

A Yes
B No
C Why bother, there is no difference

5
Monty Hall dilemma (Answer)

A Yes. Your first pick hides the car with probability 1/3, and the host's reveal of a goat
does not change that. Switching wins exactly when your first pick was a goat, so
switching wins with probability 2/3, while staying wins with probability 1/3.

6
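The dilemma is easy to settle by simulation (a Python sketch, not part of the original slides; the door encoding is my own):

```python
import random

random.seed(2)
N = 100_000

def play(switch):
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Host opens a door that hides a goat and is not the contestant's pick.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

p_switch = sum(play(True) for _ in range(N)) / N
p_stay = sum(play(False) for _ in range(N)) / N
print(p_switch, p_stay)  # ≈ 2/3 and ≈ 1/3
```

Switching wins about twice as often as staying.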
Our intuition on probability: Empirical law of large numbers
• Throwing a fair coin: Fraction of Heads should be 1/2 in the long run

• Relative frequency of event H (Head) in n repetitions of throwing a coin is

fn(H) = n(H) / n

where n(H) is the number of times Head occurred in the n repetitions

• Then

fn(H) → 1/2 as n → ∞.

• In general: The relative frequency of event E approaches a limiting value as the


number of repetitions tends to infinity.

• Intuitively we would define the probability of event E as this limiting value.

7
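The empirical law can be observed directly by tracking fn(H) as n grows (a Python sketch, assuming a fair coin modelled by uniform random numbers):

```python
import random

random.seed(3)
n_total = 100_000
n_heads = 0

# Track the relative frequency f_n(H) = n(H) / n at a few checkpoints.
for n in range(1, n_total + 1):
    n_heads += random.random() < 0.5  # one fair coin flip; True counts as 1
    if n in (10, 100, 1_000, 10_000, 100_000):
        print(n, n_heads / n)

f_n = n_heads / n_total  # close to 1/2 for large n
```

The printed relative frequencies wander at first and then settle near 1/2.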
Ingredients of a probability model

Sample space S, which can be discrete or continuous, with events as subsets of S:

• Flipping a coin, S = {H, T}; E = {H}
• Rolling a die, S = {1, 2, ..., 6}; E = {1, 2}
• Rolling a die twice, S = {(i, j) | i, j = 1, 2, ..., 6}; E = {(1, 2), (3, 4), (5, 6)}
• Process times, S = [0, ∞); E = (0, 1)
• Number of machine failures, S = {0, 1, 2, ...}; E = {3, 4, ...}
• Throwing darts, S = {(x, y) | x² + y² ≤ 1}; E = {(x, y) | 0 ≤ x ≤ 1/4, 0 ≤ y ≤ 1/4}

Probability measure P : 2^S → [0, 1] that assigns a probability to events

8
Probabilities on Events

For each event E there is a number P(E) (the probability of E) such that:
• 0 ≤ P(E) ≤ 1
• P(S) = 1
• E1, E2, ... mutually disjoint (have nothing in common), then
P(E1 ∪ E2 ∪ · · ·) = P(E1) + P(E2) + · · ·

(Figure: mutually disjoint events E1, E2, E3, E4)

9
Probabilities on Events


Examples:
• Flipping a coin, P({H}) = P({T}) = 1/2
• Rolling a die, P({1}) = 1/6, P({1, 2}) = P({1}) + P({2}) = 1/3
• Rolling a die twice, P({(i, j)}) = 1/36
• Throwing darts, P(E) = area of E divided by area of the unit disk

10
Probabilities on Events


Examples:
• Discrete S: assign p(s) ≥ 0 to each s ∈ S such that Σ_{s ∈ S} p(s) = 1. Then
P(E) = Σ_{s ∈ E} p(s)
• Equally likely outcomes s1, ..., sN, so p(si) = 1/N and
P(E) = |E| / N
11
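The rule P(E) = |E|/N can be checked by plain enumeration (a Python sketch; the two-dice events are my own examples):

```python
from fractions import Fraction
from itertools import product

# Rolling a die twice: 36 equally likely outcomes (i, j).
S = list(product(range(1, 7), repeat=2))

def prob(event):
    # P(E) = |E| / N for equally likely outcomes
    return Fraction(sum(1 for s in S if event(s)), len(S))

p_sum_seven = prob(lambda s: s[0] + s[1] == 7)  # 6 outcomes: 6/36 = 1/6
p_double = prob(lambda s: s[0] == s[1])         # 6 outcomes: 6/36 = 1/6
print(p_sum_seven, p_double)
```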
Some basic rules that follow from the ingredients

• If E ⊂ F, then P(E) ≤ P(F)


• If finitely many E1, E2, ..., En are mutually disjoint, then
P(E1 ∪ E2 ∪ · · · ∪ En) = P(E1) + P(E2) + · · · + P(En)

• P(E^c) = 1 − P(E), where E^c is the complement of E, so E^c = S \ E

• P(E ∪ F) = P(E) + P(F) − P(EF), where EF = E ∩ F

(Figure: Venn diagram of overlapping events E and F with intersection EF)

12
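These rules can be verified by enumeration on a small example (a Python sketch; the die events E and F are my own choice):

```python
from fractions import Fraction

# One roll of a fair die, equally likely outcomes.
S = {1, 2, 3, 4, 5, 6}
E = {1, 2, 3}   # "at most three"
F = {2, 4, 6}   # "even"

def P(A):
    return Fraction(len(A), len(S))

# Monotonicity: {1, 2} ⊂ E, so P({1, 2}) ≤ P(E)
assert P({1, 2}) <= P(E)
# Complement rule: P(E^c) = 1 − P(E)
assert P(S - E) == 1 - P(E)
# Inclusion-exclusion: P(E ∪ F) = P(E) + P(F) − P(EF)
assert P(E | F) == P(E) + P(F) - P(E & F)
print(P(E | F))  # 5/6
```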
Theoretical law of large numbers
• Flipping a coin an unlimited number of times, then an outcome is an infinite
sequence s of Heads and Tails,
s = (H, T, T, H, H, H, T, ...).

• Let Kn (s) denote the number of Heads in the first n flips of outcome s.
• Then, according to the theoretical (strong) law of large numbers,

lim_{n→∞} Kn(s)/n = 1/2

with probability 1
• In general: If an experiment is repeated an unlimited number of times, and if the
experiments are independent of each other, then the fraction of times event E
occurs converges with probability 1 to P(E).
• The method of computer simulation is based on this law!

13
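As an illustration, the darts probability P(E) = area of E divided by area of the unit disk (from the sample-space slide) can be estimated by repeated independent throws (a Python sketch using rejection sampling to land uniformly on the disk; the setup is my own):

```python
import math
import random

random.seed(4)
N = 1_000_000

# E = {(x, y) : 0 ≤ x ≤ 1/4, 0 ≤ y ≤ 1/4}; darts land uniformly on the unit disk.
hits = 0
accepted = 0
while accepted < N:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y <= 1:  # rejection sampling: keep only points inside the disk
        accepted += 1
        hits += (0 <= x <= 0.25) and (0 <= y <= 0.25)

p_hat = hits / N
exact = 0.25 * 0.25 / math.pi  # area of E divided by area of the unit disk
print(p_hat, exact)
```

The fraction of throws landing in E converges to P(E), exactly as the strong law promises.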
