Lecture 15 Probability

The document covers the fundamentals of discrete probability, including definitions of sample space, events, and uniform probability measures. It discusses various experiments, such as coin tosses and dice rolls, and introduces concepts like conditional probability, independence, and Bayes' theorem. Additionally, it provides examples of calculating probabilities for different events and distributions.

Discrete Probability

Lecture 15

Discrete Probability

The study of probability uses special jargon

• A sample space is a nonempty set


• An experiment is a process that produces a point in a
sample space
• An event is a subset of the sample space
• The event space is the power set of the sample space

Experiment: Toss a coin

Sample space: U={H,T}


Event Space: P(U)={Ø, {H}, {T}, {H,T}}
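As a small illustrative sketch (not from the slides), the event space can be generated in Python as the power set of the sample space; the helper name event_space is my own:

from itertools import chain, combinations

def event_space(sample_space):
    # The event space is the power set: every subset of the sample space is an event
    s = list(sample_space)
    return [set(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

print(event_space({"H", "T"}))   # [set(), {'H'}, {'T'}, {'H', 'T'}] (order may vary)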
Discrete Probability

Remark: Often some of the more interesting events have special names

Experiment: Toss two coins

Sample space: U={HH, HT, TH, TT}


Event Space: Subsets of U

Named events:
Match ={HH, TT}
At least one head = {HT, TH, HH}

Uniform Probability Measure

Def: The uniform probability measure on a finite sample space S assigns to each event E the probability

p(E) = |E| / |S|

Sample space: U={H,T}


Event          Ø     {H}     {T}     {H,T}
Probability    0     1/2     1/2     1

Uniform Probability Measure

Def: The uniform probability measure on a finite sample space S assigns to each event E the probability

p(E) = |E| / |S|
Experiment: Toss two coins
Sample space: U={HH, HT, TH, TT}

Event Name     at least one tail     one of each
Event          {HT, TH, TT}          {HT, TH}
Probability    3/4                   2/4

Uniform Probability Measure

Def: The uniform probability measure on a finite sample space S assigns to each event E the probability

p(E) = |E| / |S|
Experiment: Roll a die
Sample space: U={1,2,3,4,5,6}

Event Name     odd         even        X % 3 == 2
Event          {1,3,5}     {2,4,6}     {2,5}
Probability    3/6         3/6         2/6

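A minimal Python sketch of the uniform measure, assuming events and the sample space are represented as sets (uniform_p is an illustrative name, not from the lecture):

from fractions import Fraction

def uniform_p(event, sample_space):
    # Uniform measure: p(E) = |E| / |S|
    return Fraction(len(event & sample_space), len(sample_space))

die = {1, 2, 3, 4, 5, 6}
print(uniform_p({1, 3, 5}, die))                        # odd    -> 1/2
print(uniform_p({2, 4, 6}, die))                        # even   -> 1/2
print(uniform_p({x for x in die if x % 3 == 2}, die))   # {2, 5} -> 2/6 = 1/3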
Uniform Probability Measure

Def: The uniform probability measure on a finite sample space S assigns to each event E the probability

p(E) = |E| / |S|

Experiment: Roll two dice
Sample space: U = {11, 12, ..., 16, 21, ..., 26, ..., 61, ..., 66}   (all 36 ordered pairs)

Event Name     double                 sum is 9
Event          {11, 22, ..., 66}      {36, 45, 54, 63}
Probability    6/36                   4/36
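The two-dice counts can be double-checked by enumerating all 36 outcomes; this is an informal sketch rather than part of the original slides:

from itertools import product

outcomes = list(product(range(1, 7), repeat=2))    # 36 equally likely ordered pairs
double = [o for o in outcomes if o[0] == o[1]]
sum_is_9 = [o for o in outcomes if sum(o) == 9]
print(len(double), "/ 36")     # 6 / 36
print(len(sum_is_9), "/ 36")   # 4 / 36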
Uniform Probability Measure

Experiment: deal one card from a deck
Sample space: standard 52-card deck

Event Name     heart               seven
Event          the 13 hearts       the four sevens
Probability    13/52               4/52

Probability of Complementary Event

Let E be an event in a finite sample space S, under the uniform probability distribution. Then

p(Ē) = 1 − p(E)

Pf:
1. |S| = |E| + |Ē|
2. Divide both sides by |S|: 1 = p(E) + p(Ē), so p(Ē) = 1 − p(E).

Probability of a Union Event

Let E1 and E2 be events in a finite sample space S, under the uniform probability distribution. Then

p(E1 ∪ E2) = p(E1) + p(E2) − p(E1 ∩ E2)
Example:
1. An integer is chosen uniformly at random from {1, 2, ..., 100}. Find the probability that it is divisible either by 6 or by 15.

P(6|n ∨ 15|n) = P(6|n) + P(15|n) − P(30|n)
              = 16/100 + 6/100 − 3/100
              = 19/100
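A quick Python check of these counts, offered as an informal sketch rather than part of the lecture:

n = range(1, 101)
by6 = {x for x in n if x % 6 == 0}
by15 = {x for x in n if x % 15 == 0}
# Inclusion-exclusion: |A ∪ B| = |A| + |B| - |A ∩ B|
print(len(by6), len(by15), len(by6 & by15))   # 16 6 3
print(len(by6 | by15) / 100)                  # 0.19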
Non-uniform Probability Measure
Here are two non-uniform probability measures for some familiar sample spaces.

The standard loaded coin has probability p(H) = 0.8 and p(T) = 0.2.

The standard loaded die has sample space U={1,2,3,4,5,6}

Assign each singleton {j} the probability j/21, where 21 = 1+2+3+4+5+6.

Bernoulli distribution

A Bernoulli distribution is a probability measure on a sample space with exactly two points.

Flip the standard loaded coin

Sample space = {H, T}
Event Space = {Ø, {H}, {T}, {H,T}}

pr({H}) = 0.8 = 80/100 = 8/10 = 4/5
pr({T}) = 0.2 = 20/100 = 2/10 = 1/5
Bernoulli distribution

A Bernoulli distribution is a probability measure on a sample space with exactly two points.

The standard loaded die induces a Bernoulli distribution with

pr(even) = 2/21 + 4/21 + 6/21 = 12/21 = 4/7
pr(odd)  = 1/21 + 3/21 + 5/21 =  9/21 = 3/7
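A small sketch of the standard loaded die, assuming it is stored as a dictionary of singleton probabilities (the name loaded_die is illustrative):

from fractions import Fraction

loaded_die = {j: Fraction(j, 21) for j in range(1, 7)}   # p({j}) = j/21
assert sum(loaded_die.values()) == 1                     # the measure sums to 1

p_even = sum(p for j, p in loaded_die.items() if j % 2 == 0)
p_odd = sum(p for j, p in loaded_die.items() if j % 2 == 1)
print(p_even, p_odd)   # 4/7 3/7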
Random Variable
A random variable is a real-valued function on the sample space of a probability space.

Flip a coin three times. Let X(t) be the number of heads that occur. Then

X(TTT) = 0
X(TTH)=X(THT)=X(HTT) =1
X(THH)=X(HHT)=X(HTH)=2
X(HHH)=3

Since a random variable is a function, it is not a variable and it is not random.
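As an informal illustration (not from the slides), the random variable X can be tabulated by enumerating the eight outcomes:

from itertools import product

outcomes = ["".join(t) for t in product("HT", repeat=3)]
X = {t: t.count("H") for t in outcomes}   # X(t) = number of heads in outcome t
print(X["TTT"], X["HTT"], X["HHT"], X["HHH"])   # 0 1 2 3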
Conditional probability

Let pr be a probability distribution on a sample space U and let Y be an event. Then the conditional probability of event E given that event Y has occurred is

pr(E | Y) = pr(E ∩ Y) / pr(Y)
Example: Toss two fair coins. What is the probability of two heads, given at least one head (i.e., given the complement of {TT})?

p(HH | not TT) = p(HH ∩ not TT) / p(not TT)
               = p(HH) / p(not TT)
               = (1/4) / (3/4)
               = 1/3
Conditional Probability
Roll two fair dice. Let Y be the event that the first die is a one and let E be the event that the sum is odd. Then

pr(E | Y) = pr(E ∩ Y) / pr(Y)
          = p({12, 14, 16}) / (1/6)
          = (3/36) / (1/6)
          = 3/6
          = 1/2
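A hedged Python sketch that recovers the same value by counting outcomes (the helper name cond_p is my own):

from itertools import product
from fractions import Fraction

U = list(product(range(1, 7), repeat=2))           # 36 equally likely rolls
Y = [o for o in U if o[0] == 1]                    # first die is a one
E = [o for o in U if (o[0] + o[1]) % 2 == 1]       # sum is odd

def cond_p(A, B):
    # pr(A | B) = pr(A ∩ B) / pr(B) = |A ∩ B| / |B| under the uniform measure
    return Fraction(len(set(A) & set(B)), len(B))

print(cond_p(E, Y))   # 1/2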
Conditional probability
Example:
Toss a balanced die once and record the number on the
top face.
Let E be the event that a 1 shows on the top face.
Let F be the event that the number on the top face is odd.
What is P(E)?
What is the probability of the event E if we are told that the number on the top face is odd, that is, if we know that the event F has occurred?

Conditional probability
Key idea: The original sample space no longer applies.
The new or reduced sample space is
S={1, 3, 5}

Notice that the new sample space consists only of the outcomes in F.

P(E occurs given that F occurs) = 1/3

Notation: P(E|F) = 1/3

Independence

Let pr be a probability distribution on a sample space U. Event E is probabilistically independent of event Y if

pr(E | Y) = pr(E)

Since pr(E | Y) = pr(E ∩ Y) / pr(Y), the condition pr(E | Y) = pr(E) gives

pr(Y) = pr(E ∩ Y) / pr(E)
      = pr(Y ∩ E) / pr(E)
      = pr(Y | E)          (using P(A|B) = P(A ∧ B) / P(B))
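A brief sketch (my own illustration, not from the slides) checking independence for two fair dice by comparing pr(E ∩ Y) with pr(E)·pr(Y):

from itertools import product
from fractions import Fraction

U = list(product(range(1, 7), repeat=2))
E = {o for o in U if o[0] % 2 == 0}            # first die is even
Y = {o for o in U if (o[0] + o[1]) % 2 == 1}   # sum is odd

def p(A):
    return Fraction(len(A), len(U))

print(p(E & Y) == p(E) * p(Y))   # True: E and Y are independent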
Bayes' Theorem

• Start with the joint probability distribution:


toothache toothache

catch catch catch catch

cavity 0.108 0.012 0.072 0.008

cavity 0.016 0.064 0.144 0.576

20
Inference by enumeration

• Start with the joint probability distribution:


toothache toothache

catch catch catch catch

cavity 0.108 0.012 0.072 0.008

cavity 0.016 0.064 0.144 0.576

• P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2


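A sketch of this marginalization in Python, assuming the joint distribution is stored as a dictionary keyed by (cavity, toothache, catch) truth values; the variable names are illustrative:

joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

# P(toothache): sum every entry in which toothache is true
p_toothache = sum(p for (cav, tooth, cat), p in joint.items() if tooth)
print(round(p_toothache, 3))   # 0.2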
Inference by enumeration
• Start with the joint probability distribution:
toothache toothache

catch catch catch catch

cavity 0.108 0.012 0.072 0.008

cavity 0.016 0.064 0.144 0.576

• Can also compute conditional probabilities:


P(cavity | toothache) = P(cavity  toothache)
P(toothache)
= 0.016+0.064
0.108 + 0.012 + 0.016 + 0.064
= 0.4
22
Inference by enumeration
• Start with the joint probability distribution:
toothache toothache

catch catch catch catch

cavity 0.108 0.012 0.072 0.008

cavity 0.016 0.064 0.144 0.576

• Can also compute conditional probabilities:


P(cavity | toothache) = P(cavity ∧ toothache) / P(toothache)
                      = (0.108 + 0.012) / (0.108 + 0.012 + 0.016 + 0.064)
                      = 0.6
Inference by enumeration

toothache toothache
• Start with the joint probability distribution:
catch catch catch catch

cavity 0.108 0.012 0.072 0.008

cavity 0.016 0.064 0.144 0.576

• Can also compute conditional probabilities:


P(cavity | toothache) = 1 − P(¬cavity | toothache) = 1 − 0.4 = 0.6
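The same conditional probabilities can be checked directly from the joint table; this is a sketch using the same hypothetical dictionary as above:

joint = {  # P(cavity, toothache, catch), keyed by truth values
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}
p_tooth = sum(p for (cav, tooth, _), p in joint.items() if tooth)
p_cav_tooth = sum(p for (cav, tooth, _), p in joint.items() if cav and tooth)
print(round(p_cav_tooth / p_tooth, 2))        # 0.6 = P(cavity | toothache)
print(round(1 - p_cav_tooth / p_tooth, 2))    # 0.4 = P(¬cavity | toothache)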
Normalization
toothache toothache

catch catch catch catch

cavity 0.108 0.012 0.072 0.008

cavity 0.016 0.064 0.144 0.576

• The denominator can be viewed as a normalization constant α:

P(Cavity | toothache) = α P(Cavity, toothache)
                      = α [P(Cavity, toothache, catch) + P(Cavity, toothache, ¬catch)]
                      = α [<0.108, 0.016> + <0.012, 0.064>]
                      = α <0.12, 0.08>
                      = <0.6, 0.4>
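A short sketch of the normalization trick, reading the unnormalized numbers straight off the joint table (α is simply one over their sum):

# Unnormalized vector <P(cavity, toothache), P(¬cavity, toothache)>
unnormalized = [0.108 + 0.012, 0.016 + 0.064]      # [0.12, 0.08]
alpha = 1 / sum(unnormalized)                      # 1 / P(toothache) = 5.0
print([round(alpha * v, 2) for v in unnormalized]) # [0.6, 0.4]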
Thank You
