
CS 109: Probability For Computer Scientists Section 1: Intro To Probability

CS 109 is an introductory course on probability for computer scientists. Section 1 introduces key probability concepts including the product rule, permutations, combinations, multinomial coefficients, the binomial theorem, the principle of inclusion-exclusion, the pigeonhole principle, and complementary counting. It also defines important probability terms like the sample space, events, unions, intersections, mutually exclusive events, complements, and partitions. The section concludes by covering axioms of probability, equally likely outcomes, conditional probability, independence, Bayes' theorem, the law of total probability, and the chain rule.


CS 109: Probability for Computer Scientists

Section 1: Intro to Probability

0. Review of Main Concepts


(a) Product Rule: Suppose there are m_1 possible outcomes for event A_1, then m_2 possible outcomes for event A_2, . . . , m_n possible outcomes for event A_n. Then there are m_1 · m_2 · · · m_n = ∏_{i=1}^{n} m_i possible outcomes overall.
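The product rule is easy to sanity-check by brute-force enumeration. A minimal Python sketch (the license-plate setup is a made-up example, not from the section):

```python
from itertools import product

# Made-up example: plates with 2 letters followed by 1 digit.
# The product rule predicts 26 * 26 * 10 possible plates.
letters = "abcdefghijklmnopqrstuvwxyz"
digits = "0123456789"
predicted = 26 * 26 * 10
enumerated = sum(1 for _ in product(letters, letters, digits))
print(predicted, enumerated)  # 6760 6760
```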

(b) Number of ways to order n distinct objects: n! = n · (n − 1) · · · 3 · 2 · 1

(c) Number of ways to select from n distinct objects:

    (a) Permutations (number of ways to linearly arrange k objects out of n distinct objects, when the order of the k objects matters):

        P(n, k) = n! / (n − k)!

    (b) Combinations (number of ways to choose k objects out of n distinct objects, when the order of the k objects does not matter):

        C(n, k) = \binom{n}{k} = n! / (k!(n − k)!)
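Both formulas are available directly in Python's standard library (`math.perm` and `math.comb`, Python 3.8+), which makes them easy to check against the factorial definitions:

```python
import math

n, k = 5, 3
# P(n, k) = n! / (n - k)!  (order matters)
print(math.perm(n, k), math.factorial(n) // math.factorial(n - k))  # 60 60
# C(n, k) = n! / (k! (n - k)!)  (order does not matter)
print(math.comb(n, k), math.factorial(n) // (math.factorial(k) * math.factorial(n - k)))  # 10 10
```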
(d) Multinomial coefficients: Suppose there are n objects, but only k are distinct, with k ≤ n. (For example, “godoggy” has n = 7 objects (characters) but only k = 4 are distinct: (g, o, d, y).) Let n_i be the number of times object i appears, for i ∈ {1, 2, . . . , k}. (For example, (n_1, n_2, n_3, n_4) = (3, 2, 1, 1), continuing the “godoggy” example.) The number of distinct ways to arrange the n objects is:

    n! / (n_1! n_2! · · · n_k!) = \binom{n}{n_1, n_2, . . . , n_k}
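Continuing the “godoggy” example, the multinomial coefficient can be computed and cross-checked by brute-force enumeration of distinct arrangements:

```python
import math
from collections import Counter
from itertools import permutations

word = "godoggy"
counts = Counter(word)                   # g: 3, o: 2, d: 1, y: 1
coeff = math.factorial(len(word))        # n!
for c in counts.values():
    coeff //= math.factorial(c)          # divide out repeats: n! / (n1! n2! ... nk!)
distinct = len(set(permutations(word)))  # brute-force count of distinct arrangements
print(coeff, distinct)  # 420 420
```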

(e) Binomial Theorem: ∀x, y ∈ ℝ, ∀n ∈ ℕ: (x + y)^n = ∑_{k=0}^{n} \binom{n}{k} x^k y^{n−k}
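The identity can be spot-checked numerically for small n; a sketch with arbitrarily chosen values of x and y:

```python
import math

x, y, n = 2, 3, 4  # arbitrary small values
lhs = (x + y) ** n
rhs = sum(math.comb(n, k) * x**k * y**(n - k) for k in range(n + 1))
print(lhs, rhs)  # 625 625
```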

(f) Principle of Inclusion-Exclusion (PIE):
    2 events: |A ∪ B| = |A| + |B| − |A ∩ B|
    3 events: |A ∪ B ∪ C| = |A| + |B| + |C| − |A ∩ B| − |A ∩ C| − |B ∩ C| + |A ∩ B ∩ C|
    In general: + singles − doubles + triples − quads + . . .
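The two-event case is easy to verify with Python sets (the multiples-of-2-and-3 setup is an illustrative example, not from the section):

```python
A = set(range(2, 101, 2))  # multiples of 2 in 1..100: |A| = 50
B = set(range(3, 101, 3))  # multiples of 3 in 1..100: |B| = 33
# PIE: |A ∪ B| = |A| + |B| - |A ∩ B|
print(len(A) + len(B) - len(A & B), len(A | B))  # 67 67
```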

(g) Pigeonhole Principle: If there are n pigeons with k holes and n > k, then at least one hole contains at least 2 (or, to be precise, ⌈n/k⌉) pigeons.
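The precise bound is just a ceiling; for instance, with made-up numbers:

```python
import math

n, k = 10, 3  # 10 pigeons, 3 holes (hypothetical numbers)
# Some hole must contain at least ceil(n / k) pigeons.
print(math.ceil(n / k))  # 4
```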

(h) Complementary Counting (Complementing): If asked to find the number of ways to do X, you can instead find the total number of ways and subtract the number of ways to not do X.
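For example, counting length-3 strings over {a, b, c} that contain at least one ‘a’ is easiest by complementing; a sketch with a brute-force cross-check:

```python
from itertools import product

total = 3 ** 3      # all length-3 strings over {a, b, c}
without = 2 ** 3    # strings avoiding 'a' (only b, c): the "not X" count
by_complement = total - without
brute_force = sum(1 for s in product("abc", repeat=3) if "a" in s)
print(by_complement, brute_force)  # 19 19
```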

(i) Key Probability Definitions

(a) Sample Space: The set of all possible outcomes of an experiment, denoted Ω or S
(b) Event: Some subset of the sample space, usually a capital letter such as E ⊆ Ω
(c) Union: The union of two events E and F is denoted E ∪ F
(d) Intersection: The intersection of two events E and F is denoted E ∩ F or EF
(e) Mutually Exclusive: Events E and F are mutually exclusive iff E ∩ F = ∅
(f) Complement: The complement of an event E is denoted E^C, Ē, or ¬E, and is equal to Ω \ E

(g) De Morgan’s Laws: (E ∪ F)^C = E^C ∩ F^C and (E ∩ F)^C = E^C ∪ F^C
(h) Probability of an event E: denoted Pr(E) or P(E)
(i) Partition: Nonempty events E_1, . . . , E_n partition the sample space Ω iff
    • E_1, . . . , E_n are exhaustive: E_1 ∪ E_2 ∪ · · · ∪ E_n = ⋃_{i=1}^{n} E_i = Ω, and
    • E_1, . . . , E_n are pairwise mutually exclusive: ∀i ≠ j, E_i ∩ E_j = ∅
        – Note that for any event A (with A ≠ ∅, A ≠ Ω): A and A^C partition Ω

(j) Axioms of Probability and their Consequences

(a) Axiom 1 (Non-negativity): For any event E, Pr(E) ≥ 0
(b) Axiom 2 (Normalization): Pr(Ω) = 1
(c) Axiom 3 (Countable Additivity): If E and F are mutually exclusive, then Pr(E ∪ F) = Pr(E) + Pr(F). Also, if E_1, E_2, . . . is a countable sequence of pairwise disjoint events, then Pr(⋃_{i=1}^{∞} E_i) = ∑_{i=1}^{∞} Pr(E_i).
(d) Corollary 1 (Complementation): Pr(E) + Pr(E^C) = 1
(e) Corollary 2 (Monotonicity): If E ⊆ F, then Pr(E) ≤ Pr(F)
(f) Corollary 3 (Inclusion-Exclusion): Pr(E ∪ F) = Pr(E) + Pr(F) − Pr(E ∩ F)

(k) Equally Likely Outcomes: If every outcome in a finite sample space Ω is equally likely, and E is an event, then Pr(E) = |E| / |Ω|.
    • Make sure to be consistent when counting |E| and |Ω|: either order matters in both, or order doesn’t matter in both.
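For example, the probability that two fair dice sum to 7, computed by counting ordered outcomes consistently in both |E| and |Ω|:

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))  # ordered pairs: |Ω| = 36
event = [p for p in omega if sum(p) == 7]     # |E| = 6
print(Fraction(len(event), len(omega)))  # 1/6
```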
(l) Conditional Probability: Pr(A | B) = Pr(A ∩ B) / Pr(B)
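A quick worked example with two fair dice (illustrative numbers, not from the section): the chance the sum is at least 10, given that the first die shows 5:

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))
B = [o for o in omega if o[0] == 5]            # first die is 5
AB = [o for o in B if o[0] + o[1] >= 10]       # ... and sum >= 10: (5,5), (5,6)
pr = Fraction(len(AB), len(omega)) / Fraction(len(B), len(omega))  # Pr(A ∩ B) / Pr(B)
print(pr)  # 1/3
```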
(m) Independence: Events E and F are independent iff Pr(E ∩ F ) = Pr(E) Pr(F ), or equivalently Pr(F ) =
Pr(F | E), or equivalently Pr(E) = Pr(E | F )
(n) Bayes’ Theorem: Pr(A | B) = Pr(B | A) Pr(A) / Pr(B)
(o) Partition: Nonempty events E_1, . . . , E_n partition the sample space Ω iff
    • E_1, . . . , E_n are exhaustive: E_1 ∪ E_2 ∪ · · · ∪ E_n = ⋃_{i=1}^{n} E_i = Ω, and
    • E_1, . . . , E_n are pairwise mutually exclusive: ∀i ≠ j, E_i ∩ E_j = ∅
        – Note that for any event A (with A ≠ ∅, A ≠ Ω): A and A^C partition Ω

(p) Law of Total Probability (LTP): Suppose A_1, . . . , A_n partition Ω and let B be any event. Then
    Pr(B) = ∑_{i=1}^{n} Pr(B ∩ A_i) = ∑_{i=1}^{n} Pr(B | A_i) Pr(A_i)

(q) Bayes’ Theorem with LTP: Suppose A_1, . . . , A_n partition Ω and let B be any event. Then
    Pr(A_1 | B) = Pr(B | A_1) Pr(A_1) / ∑_{i=1}^{n} Pr(B | A_i) Pr(A_i)
    In particular, Pr(A | B) = Pr(B | A) Pr(A) / (Pr(B | A) Pr(A) + Pr(B | A^C) Pr(A^C))
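The two-event form is the classic diagnostic-test calculation. A sketch with made-up numbers (1% prevalence, 99% sensitivity, 5% false-positive rate):

```python
from fractions import Fraction

pr_A = Fraction(1, 100)             # Pr(A): prevalence (made-up number)
pr_B_given_A = Fraction(99, 100)    # Pr(B | A): positive test given condition
pr_B_given_notA = Fraction(5, 100)  # Pr(B | A^C): false-positive rate

# LTP over the partition {A, A^C}:
pr_B = pr_B_given_A * pr_A + pr_B_given_notA * (1 - pr_A)
# Bayes' theorem:
pr_A_given_B = pr_B_given_A * pr_A / pr_B
print(pr_A_given_B)  # 1/6
```

Even with a fairly accurate test, the posterior is only 1/6 because the condition is rare; this is exactly what the denominator (the LTP expansion of Pr(B)) captures.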
(r) Chain Rule: Suppose A_1, . . . , A_n are events. Then
    Pr(A_1 ∩ · · · ∩ A_n) = Pr(A_1) Pr(A_2 | A_1) Pr(A_3 | A_1 ∩ A_2) · · · Pr(A_n | A_1 ∩ · · · ∩ A_{n−1})
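For instance, the probability that the first three cards dealt from a shuffled 52-card deck are all aces, built up term by term:

```python
from fractions import Fraction

# Pr(A1) * Pr(A2 | A1) * Pr(A3 | A1 ∩ A2), where Ai = "card i is an ace"
p = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p)  # 1/5525
```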
