
Elements of Probability

Statistics (ECON 511)

Josef Ruzicka

Department of Economics
School of Sciences and Humanities
Nazarbayev University

Overview

1 Notions for probability

2 Axioms of probability

3 Calculating probabilities

4 Independence

5 Problems

• Reading: Ross, Chapter 3

Two interpretations of probability

• There are two interpretations of probability:


• Frequentist – if we could repeat an experiment many times, the relative
frequency with which an outcome occurs coincides with its probability.
So probability is a property of the outcome.
• Subjective (personal) – probability expresses a person's beliefs; it can be
helpful when making decisions under uncertainty.
• The mathematical properties of probability are the same, irrespective of
the interpretation.

Sample space and events
• Consider an experiment whose outcome is uncertain, but we know the
set of all possible outcomes. This set is called the sample space and
denoted S.
• Examples:
• 1. S = {s, ns} where s means there is a storm in a given city on a given
day, while ns means no storm occurs there.
• 2. In a race among seven horses, labeled {1, 2, 3, 4, 5, 6, 7}, we are
interested in their position. For example, one possible outcome is
s = (3, 5, 1, 6, 4, 7, 2). The sample space is

S = {all orderings of (1, 2, 3, 4, 5, 6, 7)}

• 3. Suppose we are interested in measuring how long (in days) a light
bulb will keep working. It may break immediately after being switched
on, but it may also last a very long time. We set S = [0, ∞).
• Any subset of S is called an event and denoted E . If the outcome is in
E , we say that E has occurred.
Sample space and events

• Some events in the previous examples:


• 1. The only events are {s}, {ns}, {s, ns}, ∅.
• 2. An event is:
{Horse 3 is in position 7.}
Another event:

{Horse 3 has a lower position than horse 1.}

• 3. An event is:

{The light bulb lasts between 180 and 360 days.}

Sample space and events
Definition
For any events E and F we define:
• The event that any outcome in E or F occurs is the union of E and F ,
denoted E ∪ F .
• The event that any outcome in both E and F occurs is the intersection of E
and F , denoted EF or E ∩ F .
• When E ∩ F = ∅, we say that E and F are mutually exclusive.
• When E does not occur, we call this event the complement of E, denoted E^C.
• When all outcomes in E are also in F, we write E ⊂ F.
For any events E1, E2, . . . , En, we define
• The event that an outcome in any event Ei, i ∈ {1, 2, . . . , n}, occurs is the
union of E1, E2, . . . , En, denoted E1 ∪ E2 ∪ · · · ∪ En or ∪_{i=1}^n Ei.
• The event that an outcome in all events Ei, i ∈ {1, 2, . . . , n}, occurs is the
intersection of E1, E2, . . . , En, denoted E1 E2 . . . En or ∩_{i=1}^n Ei.
Venn diagrams and the algebra of events

Theorem
Take any events E, F, G. Then:
• E ∪ F = F ∪ E, EF = FE (commutative law)
• (E ∪ F) ∪ G = E ∪ (F ∪ G), (EF)G = E(FG) (associative law)
• (E ∪ F)G = EG ∪ FG, EF ∪ G = (E ∪ G)(F ∪ G) (distributive law)
• (E ∪ F)^C = E^C F^C, (EF)^C = E^C ∪ F^C (De Morgan's laws)

• These laws can be illustrated by Venn diagrams.

Axioms of probability
Definition (Probability)
For any event E of an experiment with sample space S, the probability of
event E, denoted P(E), is a real number such that:
1. 0 ≤ P(E) ≤ 1
2. P(S) = 1
3. For any sequence of mutually exclusive events E1, E2, . . . it holds

P(E1 ∪ E2 ∪ · · · ∪ En) = P(E1) + P(E2) + · · · + P(En), where n = 1, 2, . . . , ∞

Theorem
P(E^C) = 1 − P(E)

Theorem
P(E ∪ F) = P(E) + P(F) − P(EF)
Sample spaces with equally likely outcomes
• In many experiments, the number of possible outcomes is finite and
each of them is equally likely.
• Formally, S = {1, 2, . . . , N} and P(i) = 1/N.
• It follows that for any event E

P(E) = (number of elements in E) / N

Example
There are 7 black and 4 white balls in a bowl. We draw two balls at
random. What is the probability that one is black and one is white?
There are 11 balls in total, so the first draw could be any of the 11 balls,
while the second one must be one of the remaining 10. In total there are
11 · 10 = 110 possibilities. Out of these, there are 7 · 4 ways to draw a black
ball first and a white one second. There are also 4 · 7 ways to draw a white
ball first and a black one second. Thus, the number of favorable outcomes is
28 + 28 = 56 and P = 56/110 ≈ 0.509.
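
• This answer can also be checked by simulation; a minimal Python sketch (the function name and trial count are my own choices):

import random

def one_black_one_white(trials=100_000):
    balls = ["black"] * 7 + ["white"] * 4
    hits = 0
    for _ in range(trials):
        first, second = random.sample(balls, 2)  # ordered draw without replacement
        if first != second:  # one black and one white, in either order
            hits += 1
    return hits / trials

print(one_black_one_white())  # close to 56/110 ≈ 0.509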
Sample spaces with equally likely outcomes

• The number of ways that n distinct objects can be ordered is denoted
n! and it is equal to

n! = n(n − 1)(n − 2) · · · 3 · 2 · 1
• Each such ordering is called a permutation.
• Suppose we have n objects and we are interested in forming groups of
size r. Their number is

(n choose r) = n! / ((n − r)! r!)

• Each such grouping is called a combination.
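
• In Python both counts are available from the standard library; a small illustration tied to the earlier examples (math.comb assumes Python 3.8+):

from math import comb, factorial

print(factorial(7))  # 7! = 5040 orderings of the seven horses
print(comb(11, 2))   # 55 ways to choose 2 balls out of 11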

Conditional probability

Definition
Let E, F be events such that P(F) > 0. The conditional probability of
event E given F is defined as

P(E|F) = P(EF) / P(F)

Example
There are 20 balls in a bowl: 8 white, 2 black, 1 yellow, and 9 red. A ball has
been drawn and we know it isn’t white. What is the probability that it isn’t
black?
P(ball not black | ball not white) = P(ball neither black nor white) / P(ball not white)
= ((1 + 9)/20) / (12/20) = 10/12 = 5/6
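
• The same number falls straight out of the definition; a minimal Python sketch (variable names are my own):

p_not_white = 12 / 20            # 2 black + 1 yellow + 9 red
p_not_black_not_white = 10 / 20  # 1 yellow + 9 red
print(p_not_black_not_white / p_not_white)  # P(E|F) = P(EF)/P(F) = 5/6 ≈ 0.833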

Bayes’ formula

• Take any events E, F. We can express E as

E = EF ∪ EF^C

• Provided 0 < P(F) < 1, it follows (since EF and EF^C are mutually exclusive)

P(E) = P(EF) + P(EF^C)
     = P(E|F)P(F) + P(E|F^C)P(F^C)
     = P(E|F)P(F) + P(E|F^C)(1 − P(F))

Bayes’ formula
Example
Suppose a student is answering a question on a multiple-choice test. There are m
possible answers. The probability that the student knows the answer is p. If the
student doesn’t know the answer, it’s chosen at random. What is the conditional
probability that the student knows the answer given that the student answered
correctly?
Denote K the event that the student knows the answer. Denote C the event that the
student chooses the correct answer. We need to find P(K |C ). Bayes’ formula yields

P(C ) = P(C |K )P(K ) + P(C |K C )(1 − P(K ))

We know P(K) = p. Also, P(C|K) = 1 and P(C|K^C) = 1/m. Thus,

P(C) = 1 · p + (1/m)(1 − p) = (pm + 1 − p)/m

From the definition of conditional probability,

P(K|C) = P(KC)/P(C) = P(C|K)P(K)/P(C) = (1 · p) / ((pm + 1 − p)/m) = pm / (pm + 1 − p)

For instance, if p = 0.5 and m = 4, then P(K |C ) = 0.8.
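
• The closed form is easy to evaluate; a minimal Python sketch (the function name is my own):

def p_knows_given_correct(p, m):
    # P(K|C) = pm / (pm + 1 - p)
    return p * m / (p * m + 1 - p)

print(p_knows_given_correct(0.5, 4))  # 0.8, matching the slide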


Bayes’ formula

Example
Suppose 1% of the population has a disease and consider a laboratory test with
the following characteristics. When a person has the disease, the test is positive
with probability 95%. When a person doesn't have the disease, the test is
positive 5% of the time. What is the conditional probability that a person has
the disease given a positive test?
Denote D the event that the person has the disease. Denote T the event that
the test is positive. We need to find P(D|T ). We know P(D) = 0.01. Also,
P(T |D) = 0.95, P(T |D C ) = 0.05. Thus,

P(D|T) = P(DT)/P(T) = P(T|D)P(D) / (P(T|D)P(D) + P(T|D^C)P(D^C))
       = (0.95 · 0.01) / (0.95 · 0.01 + 0.05 · 0.99) ≈ 0.16
So if you test positive, the probability that you actually have the disease is only
16%.
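
• The same computation in Python (a minimal sketch; variable names are my own):

prevalence = 0.01          # P(D)
p_pos_given_d = 0.95       # P(T|D)
p_pos_given_not_d = 0.05   # P(T|D^C)
p_pos = p_pos_given_d * prevalence + p_pos_given_not_d * (1 - prevalence)
print(p_pos_given_d * prevalence / p_pos)  # ≈ 0.161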

Bayes’ formula – general form
• Let's split the sample space S into n mutually disjoint events, that is

S = F1 ∪ F2 ∪ · · · ∪ Fn

where Fi Fj = ∅ for all i ≠ j, i, j ∈ {1, 2, . . . , n}. (Such a representation of S is called
a partition of S.)
• Then for any event E we can write

E = EF1 ∪ EF2 ∪ · · · ∪ EFn

P(E) = P(EF1) + · · · + P(EFn) = P(E|F1)P(F1) + · · · + P(E|Fn)P(Fn)

• The general form of Bayes' formula is

P(Fj|E) = P(EFj)/P(E) = P(E|Fj)P(Fj) / (P(E|F1)P(F1) + · · · + P(E|Fn)P(Fn))
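
• The general form translates into a few lines of Python; a sketch with names of my own choosing:

def bayes(j, priors, likelihoods):
    # P(F_j | E) from priors P(F_i) and likelihoods P(E | F_i) over a partition
    p_e = sum(p * l for p, l in zip(priors, likelihoods))  # total probability P(E)
    return likelihoods[j] * priors[j] / p_e

# The disease example recast as a two-element partition {D, D^C}:
print(bayes(0, [0.01, 0.99], [0.95, 0.05]))  # ≈ 0.161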
Independent events

Definition (Independence of two events)


We say that two events E and F are independent if

P(EF) = P(E)P(F)

When two events are not independent, we say they are dependent.
• When E and F are independent, P(E|F) = P(EF)/P(F) = P(E)P(F)/P(F) = P(E).
• When E and F are independent, then E and F^C are independent, too.
• Proof: P(EF^C) = P(E) − P(EF) = P(E) − P(E)P(F) =
P(E)(1 − P(F)) = P(E)P(F^C)
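
• Independence can also be checked empirically; a minimal simulation sketch (the choice of events is my own):

import random

trials, e, f, ef = 100_000, 0, 0, 0
for _ in range(trials):
    d1, d2 = random.randint(1, 6), random.randint(1, 6)
    in_e = d1 == 6        # E: first die shows 6
    in_f = d2 % 2 == 0    # F: second die is even
    e += in_e
    f += in_f
    ef += in_e and in_f
print(ef / trials, (e / trials) * (f / trials))  # both ≈ 1/12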

Independent events

Definition (Independence of three events)


We say that three events E, F and G are independent if

P(EFG) = P(E)P(F)P(G)
P(EF) = P(E)P(F)
P(EG) = P(E)P(G)
P(FG) = P(F)P(G)

• It's possible that among 3 events, each two of them are independent,
but the 3 events together are not independent. For example, toss two fair
coins and let E = {first coin shows heads}, F = {second coin shows heads},
G = {both coins show the same side}: any two of these events are
independent, yet P(EFG) = 1/4 ≠ 1/8 = P(E)P(F)P(G).
• However, when 3 events are independent, then any two of them are
also independent.

Blitzstein & Hwang, Problem 1

• How many ways are there to permute the letters in the word
MISSISSIPPI?
• Before addressing that, let’s consider an easier problem: How many
ways are there to permute the letters in the word CHELYABINSK?
• There are 11 letters, all different, so the answer is 11! (which means
11 · 10 · 9 · 8 · 7 · 6 · 5 · 4 · 3 · 2 = 39916800).
• More detailed answer: place the letters C, H, E, L, Y, A, B, I, N, S, K
into 11 spots one at a time.
There are 11 spots where the letter C can be placed.
Now that C has been placed, there are 10 spots left. So there are 10
ways to place H.
Now that C and H have been placed, there are 9 spots left. So there are
9 ways to place E. And so on.

Blitzstein & Hwang, Problem 1
• Let’s go back to the original problem: How many ways are there to
permute the letters in the word MISSISSIPPI?
• I repeats 4 times, S repeats 4 times, P repeats twice.
• Suppose for a moment the letters I, S, P carry indices, e.g.
MI1 S1 S2 I2 S3 S4 I3 P1 P2 I4 . Then the answer would be 11!.
• But this isn't what we are looking for – we need to remove all the
artificial indices – then we can't distinguish e.g. MI1 S2 S1 I2 S3 S4 I3 P1 P2 I4
from MI1 S1 S2 I2 S3 S4 I3 P1 P2 I4 .
• For each such word with indices, how many ways are there to
arrange P1 and P2 , while keeping the remaining letters in their position?
There are only 2 ways. They both correspond to the same word once
the indices are removed.
• How many ways are there to arrange I1 , I2 , I3 and I4 , while keeping the
remaining letters in their position? 4!
• How many ways are there to arrange S1 , S2 , S3 and S4 , while keeping the
remaining letters in their position? 4!
• Thus, each word without indices corresponds to 2 · 4! · 4! words with
indices.
• We conclude there are 11!/(2 · 4! · 4!) = 34650 ways to permute the letters in the
word MISSISSIPPI.
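
• A one-line check in Python:

from math import factorial

print(factorial(11) // (factorial(2) * factorial(4) * factorial(4)))  # 34650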
Blitzstein & Hwang, Problem 36

• A group of 30 dice is thrown. What is the probability that 5 of each
of the values 1, 2, 3, 4, 5, 6 appear?
• Looks tough, so let's consider an easier problem first.
• A group of 4 dice is thrown. What is the probability that 1 of each of
the values 1, 2, 3, 4 appears?
• Examples of outcomes from the sample space: (1,3,6,1), (2,5,4,1),
(6,4,5,5), etc.
• Total number of possible outcomes = number of elements in the sample
space: 6^4
• Examples of favorable outcomes: (1,2,3,4), (2,4,1,3), (3,1,2,4), etc.
• Total number of favorable outcomes: 4!
• Thus, the probability that 1 of each of the values 1, 2, 3, 4 appears is:
4!/6^4 ≈ 0.019

Blitzstein & Hwang, Problem 36

• Before delving into the original problem, let’s consider yet another
easier problem.
• A group of 4 dice is thrown. What is the probability that 2 of each of
the values 1, 2 appear?
• Number of possible outcomes: 6^4
• Favorable outcomes (all of them): (1,1,2,2), (1,2,1,2),
(1,2,2,1), (2,1,1,2), (2,1,2,1), (2,2,1,1)
• Number of favorable outcomes: 6 (which can also be expressed as
4!/(2! 2!) = (4 · 3 · 2)/(2 · 2) = 6)
• Thus, the probability that 2 of each of the values 1, 2 appear is:
6/6^4 ≈ 0.005

Blitzstein & Hwang, Problem 36
• Let's dare to solve the original problem!
• A group of 30 dice is thrown. What is the probability that 5 of each
of the values 1, 2, 3, 4, 5, 6 appear?
• Number of possible outcomes: 6^30
• Number of favorable outcomes: 30!/(5!)^6

• Why? An example of a favorable outcome:


(1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6)
• Remember MISSISSIPPI:
If we put artificial indices on the dice to distinguish equal values, there
would be 30! permutations.
But many of such permutations are indistinguishable when the indices
are not there – we have to account for it.
How many ways can we rearrange the five indexed 1s (1_1, 1_2, 1_3, 1_4, 1_5),
while keeping the remaining digits in their place? 5!. The same holds for each
of the six values, so each index-free outcome corresponds to (5!)^6 indexed
permutations.
• We conclude that the probability that 5 of each of the values 1, 2, 3, 4,
5, 6 appear is:
(30!/(5!)^6) / 6^30 ≈ 0.0004
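
• A quick check of the final number in Python (a minimal sketch):

from math import factorial

favorable = factorial(30) // factorial(5) ** 6
print(favorable / 6 ** 30)  # ≈ 0.000402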
Problem 1
We toss a (fair) coin twice and record the outcomes. What is the probability of the following
events?
(a) Both are heads.
(b) Both are tails.
(c) One is heads and the other is tails.
(d) At least one is heads.
Now the coin is tossed three times. What is the probability of the following events?
(e) Two are heads and one is tails.
(f) All three are heads.
Now the coin is tossed five times. What is the probability of the following events?
(g) Three are heads and two are tails.
(h) All five are heads.

• (a) 1/4, (b) 1/4, (c) 1/2, (d) 3/4
• (e) 3/2^3 = 3/8, (f) 1/2^3 = 1/8
• (g) (5 choose 2)/2^5 = 10/32 = 5/16, (h) 1/2^5 = 1/32
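
• All eight answers follow one pattern, P(exactly k heads in n fair tosses) = (n choose k)/2^n; a minimal Python sketch (the function name is my own):

from math import comb

def p_heads(k, n):
    return comb(n, k) / 2 ** n

print(p_heads(3, 5))  # (g): 10/32 = 5/16
print(p_heads(5, 5))  # (h): 1/32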


Problem 2
Suppose that in a class of 40 students there are 36 right-handed students and 4 left-handed ones. Two students are selected
at random. What is the probability of the following events?
(a) Both are right-handed.
(b) Both are left-handed.
(c) One is right-handed and the other is left-handed.
(d) At least one is right-handed.
Now suppose that instead of selecting two students, we select three students at random. What is the probability of the
following events?
(e) Two are right-handed and one is left-handed.
(f) All three are right-handed.
Now suppose that instead of selecting three students, we select five students at random. What is the probability of the
following events?
(g) Three are right-handed and two are left-handed.
(h) All five are right-handed.

• (a) 36/40 · 35/39
• (b) 4/40 · 3/39
• (c) 36/40 · 4/39 + 4/40 · 36/39
• (d) 1 − P(both are left-handed) = 1 − 4/40 · 3/39
• (e) (36 choose 2)(4 choose 1) / (40 choose 3)
• (f) (36 choose 3)(4 choose 0) / (40 choose 3) = (36 choose 3) / (40 choose 3)
• (g) (36 choose 3)(4 choose 2) / (40 choose 5)
• (h) (36 choose 5) / (40 choose 5)
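
• Parts (e) to (h) follow one (hypergeometric) pattern; a minimal Python sketch (the function name is my own):

from math import comb

def p_right_handed(k, n):
    # P(k right-handed among n students drawn from 36 right- and 4 left-handed)
    return comb(36, k) * comb(4, n - k) / comb(40, n)

print(p_right_handed(3, 5))  # (g)
print(p_right_handed(5, 5))  # (h)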
Problem 3
A bag contains 8 white balls, 8 black balls and 8 red balls. We draw one ball from the bag, record its color, put it back into
the bag and mix the balls in the bag well. Then we draw one more ball.
(a) What is the probability that both drawn balls have the same color?
(b) How does the result change if we don’t return the first-drawn ball back into the bag?

• Let us first denote the events:


• Wi means the ith drawn ball is white
• Bi means the ith drawn ball is black
• Ri means the ith drawn ball is red
• (a):
• P(both have the same color) = P(W1 ∩ W2 ) + P(B1 ∩ B2 ) + P(R1 ∩ R2 )
• By symmetry, P(both have the same color) = 3P(W1 ∩ W2 )
• P(W1 ∩ W2 ) = 8/24 · 8/24 = 1/9
• Thus, P(both have the same color) = 3 · 1/9 = 1/3
• (b):
• The first two steps are the same
• P(W1 ∩ W2 ) = 8/24 · 7/23 = 1/3 · 7/23
• Thus, P(both have the same color) = 3 · 1/3 · 7/23 = 7/23
• Notice: 7/23 ≈ 0.30 < 1/3
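
• Both cases in two lines of Python (variable names are my own):

same_with_replacement = 3 * (8 / 24) * (8 / 24)     # (a): = 1/3
same_without_replacement = 3 * (8 / 24) * (7 / 23)  # (b): = 7/23
print(same_with_replacement, same_without_replacement)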
Problem 4
We roll six dice. What is the probability that the numbers we get are
(a) all different from each other?
(b) only even numbers?

• (a):
• Total number of outcomes: 6^6
• Number of favorable outcomes: 6!
• P(all different from each other) = 6!/6^6
• (b):
• Total number of outcomes: 6^6
• Number of favorable outcomes: 3^6
• P(only even numbers) = 3^6/6^6 = 1/2^6
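
• A quick check of both answers in Python:

from math import factorial

print(factorial(6) / 6 ** 6)  # (a) ≈ 0.0154
print(3 ** 6 / 6 ** 6)        # (b) = 1/2^6 = 1/64 ≈ 0.0156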

Problem 5
A professor strolled downtown with an umbrella. Every time the professor visits a bookshop with an umbrella, the umbrella
gets forgotten there with probability 0.5. The professor has visited four bookshops and has come home without an umbrella.
What is the probability the umbrella has been left in the last bookshop?

• Let us first denote and describe the events:
• Ui means the umbrella is forgotten in the ith bookshop (i ∈ {1, 2, 3, 4}).
• Ui implies the umbrella is not forgotten in the jth bookshop for any j < i.
• P(U1) = 1/2, P(U2) = 1/2^2, P(U3) = 1/2^3, P(U4) = 1/2^4
• P(U1^C ∩ U2^C ∩ U3^C ∩ U4^C) = 1/2^4
• P(umbrella in 4th bookshop given no umbrella at home) is the same as
P(U4 | U1 ∪ U2 ∪ U3 ∪ U4) = P(U4 ∩ (U1 ∪ U2 ∪ U3 ∪ U4)) / P(U1 ∪ U2 ∪ U3 ∪ U4)
= P(U4) / (P(U1) + P(U2) + P(U3) + P(U4)) = (1/2^4) / (1/2 + 1/2^2 + 1/2^3 + 1/2^4) = 1/15
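
• A quick check of the conditioning step in Python (a minimal sketch):

p = [1 / 2 ** i for i in range(1, 5)]  # P(U_1), ..., P(U_4)
print(p[3] / sum(p))                   # = 1/15 ≈ 0.0667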
