4DC10 Week 1 Lecture
Michel Reniers
• Manufacturing
• Computer networks (Internet)
• Supply chains
• Statistical physics
• Quantum physics
• Random heterogeneous materials
• Biology
• Computer simulation
Magician
A magician takes a random card from a deck of 52 cards. You have to guess which card he is holding.
Before guessing you may ask one of the following two questions:
1. “Is the card a black one?”
2. “Is the card the jack of hearts?”
Question: Which question will maximize the probability of guessing the right card?
A Question 1
B Question 2
C Does not matter
Magician (Answer)
• “Is the card a black one?”
With probability 1/2 the answer is “yes”. Then 26 cards remain to choose from, so your probability of guessing the right one is 1/26. A similar reasoning applies for answer “no”.
The probability of guessing the right one is 1/2 · 1/26 + 1/2 · 1/26 = 1/26.
• “Is the card the jack of hearts?”
With probability 1/52 the answer is “yes” and you are done.
With probability 51/52 the answer is “no”, and then the probability of guessing the right one is 1/51.
So the probability of guessing the right one is 1/52 · 1 + 51/52 · 1/51 = 1/26.
Both questions give the same probability 1/26, so it does not matter which one you ask (answer C).
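A minimal Python sketch (not from the slides) that checks both computations with exact fractions:

    from fractions import Fraction

    # Strategy 1: ask "Is the card a black one?"
    # Either answer leaves 26 equally likely cards to choose from.
    p_q1 = Fraction(1, 2) * Fraction(1, 26) + Fraction(1, 2) * Fraction(1, 26)

    # Strategy 2: ask "Is the card the jack of hearts?"
    # "Yes" (prob 1/52) identifies the card; "no" (prob 51/52) leaves 51 candidates.
    p_q2 = Fraction(1, 52) * 1 + Fraction(51, 52) * Fraction(1, 51)

    print(p_q1, p_q2)  # both print 1/26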
Monty Hall dilemma
You are a contestant in a game show with three closed doors. Behind one door is a car, behind the other two are goats. You pick a door; the host, who knows where the car is, then opens one of the other doors and reveals a goat.
Question: Now you are given the opportunity to switch doors: Are you going to switch?
A Yes
B No
C Why bother, there is no difference
Monty Hall dilemma (Answer)
Your first pick is right with probability 1/3, so switching wins the car with probability 2/3: you should switch (answer A).
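A small Monte Carlo check in Python (a sketch, not part of the slides): simulate many games and compare the win fractions of the two strategies.

    import random

    def monty_hall(switch, trials=100_000):
        wins = 0
        for _ in range(trials):
            car = random.randrange(3)    # door hiding the car
            pick = random.randrange(3)   # contestant's first pick
            # Host opens a door that is neither the pick nor the car.
            # (The deterministic tie-break when pick == car does not affect the result.)
            host = next(d for d in range(3) if d != pick and d != car)
            if switch:
                pick = next(d for d in range(3) if d != pick and d != host)
            wins += (pick == car)
        return wins / trials

    print("stay:  ", monty_hall(switch=False))  # about 1/3
    print("switch:", monty_hall(switch=True))   # about 2/3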
Our intuition on probability: Empirical law of large numbers
• Throwing a fair coin: the fraction of Heads should be 1/2 in the long run.
• If f_n(H) denotes the fraction of Heads in the first n throws, then
f_n(H) → 1/2 as n → ∞.
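A quick way to see this empirically (a Python sketch, not part of the slides): simulate coin flips and watch the running fraction of Heads.

    import random

    n = 100_000
    heads = 0
    for i in range(1, n + 1):
        heads += random.random() < 0.5  # one fair coin flip
        if i in (10, 100, 1_000, 10_000, 100_000):
            print(f"f_{i}(H) = {heads / i:.4f}")  # drifts towards 0.5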
Ingredients of a probability model
• A sample space S: the set of all possible outcomes of the experiment.
• Events: subsets E of the sample space S.
• A probability P(E) assigned to each event E.
Probabilities on Events
For each event E there is a number P(E) (the probability of E) such that:
• 0 ≤ P(E) ≤ 1
• P(S) = 1
• If E1, E2, ... are mutually disjoint (have nothing in common), then
P(E1 ∪ E2 ∪ · · ·) = P(E1) + P(E2) + · · ·
(Figure: disjoint events E1, E2, E3, E4 inside the sample space S)
Examples:
• Flipping a coin, P({H}) = P({T}) = 1/2
• Rolling a die, P({1}) = 1/6, P({1, 2}) = P({1}) + P({2}) = 1/3
• Rolling a die twice, P({(i, j)}) = 1/36
• Throwing darts, P(E) = area of E, divided by the area of the unit disk
• Discrete sample space S: assign p(s) ≥ 0 to each s ∈ S such that Σ_{s∈S} p(s) = 1. Then
P(E) = Σ_{s∈E} p(s)
• Equally likely outcomes s1, ..., sN, so p(si) = 1/N and
P(E) = |E| / N
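A tiny Python sketch of the last two bullets (not from the slides): enumerate the 36 equally likely outcomes of two die rolls and compute P(E) = |E|/N for the illustrative event “the sum is 7”.

    from fractions import Fraction

    # Sample space: all 36 equally likely outcomes (i, j) of rolling a die twice.
    S = [(i, j) for i in range(1, 7) for j in range(1, 7)]

    # Event E: the two rolls sum to 7 (an illustrative choice).
    E = [(i, j) for (i, j) in S if i + j == 7]

    print(Fraction(len(E), len(S)))  # |E| / N = 6/36 = 1/6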
Some basic rules that follow from the ingredients
• P(E^c) = 1 − P(E): the probability that E does not occur is 1 minus the probability that it does.
• P(E ∪ F) = P(E) + P(F) − P(EF), where EF denotes the intersection of E and F.
(Figure: Venn diagram of two overlapping events E and F with intersection EF)
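A short Python sketch (not from the slides) checking the rule P(E ∪ F) = P(E) + P(F) − P(EF) on the two-dice sample space, with the illustrative events E = “first roll is 6” and F = “the rolls sum to 7”.

    from fractions import Fraction

    S = [(i, j) for i in range(1, 7) for j in range(1, 7)]
    P = lambda event: Fraction(len([s for s in S if event(s)]), len(S))

    E = lambda s: s[0] == 6          # first roll is a 6
    F = lambda s: s[0] + s[1] == 7   # the rolls sum to 7

    lhs = P(lambda s: E(s) or F(s))                  # P(E ∪ F)
    rhs = P(E) + P(F) - P(lambda s: E(s) and F(s))   # P(E) + P(F) - P(EF)
    print(lhs, rhs, lhs == rhs)  # 11/36 11/36 True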
Theoretical law of large numbers
• Flipping a coin an unlimited number of times, then an outcome is an infinite
sequence s of Heads and Tails,
s = (H, T, T, H, H, H, T, ...).
• Let Kn (s) denote the number of Heads in the first n flips of outcome s.
• Then, according to the theoretical (strong) law of large numbers,
lim_{n→∞} K_n(s) / n = 1/2
with probability 1.
• In general: If an experiment is repeated an unlimited number of times, and if the
experiments are independent of each other, then the fraction of times event E
occurs converges with probability 1 to P(E).
• The method of computer simulation is based on this law!
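As an illustration of how simulation exploits this law (a Python sketch, not part of the slides): estimate P(E) for the event “two dice sum to 7” by repeating the experiment many times; the observed fraction approaches the exact value 1/6.

    import random

    def estimate(trials):
        # Repeat the independent experiment `trials` times and count how often E occurs.
        hits = sum(random.randint(1, 6) + random.randint(1, 6) == 7 for _ in range(trials))
        return hits / trials

    for n in (100, 10_000, 1_000_000):
        print(n, estimate(n))  # fractions drift towards 1/6 ≈ 0.1667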