Lecture 1

(1) Probability is a branch of mathematics that deals with describing how likely events are to occur numerically. (2) A random experiment is defined as an experiment where the outcomes are unknown, but the sample space of all possible outcomes is known. The sample space is the set of all possible outcomes. (3) Probability is defined as a function that assigns a number between 0 and 1 to each event, where an event is a subset of the sample space. This function must satisfy three axioms.

Uploaded by Anushka Vijay

Probability

Probability is the branch of mathematics dealing with the numerical description of how likely something is to happen. In other words, probability is the measure of the likelihood that an event will occur.
Definition 1. (1) A set E is said to be countable if there exists a bijective map from
the set of natural numbers N to E.
(2) A set E is said to be uncountable if it is neither finite nor countable.
Example 2. (1) Define f : N −→ N by f (n) = n. Clearly f is one-one and onto.
Thus, N is countable.
(2) Let Z denote the set of integers. Define f : N −→ Z by
f (n) = (n − 1)/2 if n is odd, and f (n) = −n/2 if n is even.
Clearly f is one-one and onto. Thus, Z is countable.
(3) The set of all rational numbers Q is also countable. Prove!
(4) The set of real numbers R as well as intervals (excluding one point set) in R are
uncountable. Prove!
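The bijection of Example 2(2) is easy to check numerically; a minimal Python sketch (the function name `f` simply mirrors the definition above):

```python
def f(n):
    """Bijection f : N -> Z from Example 2(2):
    f(n) = (n - 1)/2 if n is odd (giving 0, 1, 2, ...),
    f(n) = -n/2      if n is even (giving -1, -2, -3, ...)."""
    return (n - 1) // 2 if n % 2 == 1 else -(n // 2)

# The first 99 naturals are mapped one-one onto the integers -49, ..., 49.
images = [f(n) for n in range(1, 100)]
assert len(set(images)) == len(images)          # one-one on this range
assert set(images) == set(range(-49, 50))       # onto {-49, ..., 49}
```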
Definition 3 (Random experiment). A random experiment is an experiment in which

(1) the set of all possible outcomes of the experiment is known in advance;
(2) the outcome of a particular trial of the experiment cannot be predicted in advance;
(3) the experiment can be repeated under identical conditions.
Definition 4 (Sample Space). The set of all possible outcomes of a random experiment
is called the sample space. We will denote the sample space of a random experiment by
S. For example:

(1) For tossing a fair (unbiased) coin, the sample space S is {H, T }, where H means
that the outcome of the toss is a head and T means that it is a tail.
(2) For rolling a fair die, the sample space S is {1, 2, 3, 4, 5, 6}.
(3) For simultaneously flipping a coin and rolling a die, the sample space S is {H, T }×
{1, 2, 3, 4, 5, 6}.
(4) For flipping two coins, the sample space S is {(H, H), (H, T ), (T, H), (T, T )}.
(5) For rolling two dice, the sample space S is {(i, j) : i, j ∈ {1, 2, 3, 4, 5, 6}}.
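For finite experiments such as these, the sample spaces can be enumerated directly as Cartesian products; a small sketch using Python's standard library (variable names are ours):

```python
from itertools import product

coin = ['H', 'T']
die = [1, 2, 3, 4, 5, 6]

# (3) simultaneously flipping a coin and rolling a die: {H, T} x {1, ..., 6}
coin_and_die = list(product(coin, die))
assert len(coin_and_die) == 12

# (4) flipping two coins and (5) rolling two dice: all ordered pairs
two_coins = list(product(coin, coin))
two_dice = list(product(die, die))
assert len(two_coins) == 4 and len(two_dice) == 36
```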
Definition 5 (σ-algebra). A non-empty collection F of subsets of S is called a σ-algebra
(or σ-field) if

(1) S ∈ F;
(2) A ∈ F ⇒ A^c ∈ F;
(3) A1, A2, . . . ∈ F ⇒ ∪_{i=1}^∞ Ai ∈ F.
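For a finite sample space, the three conditions can be checked by brute force; a sketch (the helper name `is_sigma_algebra` is ours; for a finite collection, closure under countable unions reduces to closure under pairwise unions):

```python
from itertools import combinations

def is_sigma_algebra(S, F):
    """Check the three sigma-algebra conditions for a finite collection F
    of frozensets of outcomes from S. Since F is finite, closure under
    countable unions reduces to closure under unions of two members."""
    if S not in F:
        return False                                  # (1) S in F
    if any((S - A) not in F for A in F):
        return False                                  # (2) closed under complement
    return all((A | B) in F for A in F for B in F)    # (3) closed under unions

S = frozenset({1, 2, 3})
power_set = {frozenset(c) for r in range(4) for c in combinations(S, r)}

assert is_sigma_algebra(S, power_set)                # the power set always works
assert not is_sigma_algebra(S, {S, frozenset({1})})  # complement of {1} missing
```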

Event and Event space: An event is a subset of the sample space S. We say that the
event E occurs when the outcome of the random experiment lies in E.

In general, not every subset of S is an event; rather, an event is a special subset. The event space (the set of all events), denoted by Σ, is a subset of the power set of S. An event space must be a σ-algebra.

In the next remark the event space will be fixed for different sample spaces. This will
be used throughout the course.
Remark 6. (1) If the sample space S is a finite or a countable set, then
we will take Σ = P(S), where P(S) is the power set of S.
(2) Let B_R denote the collection containing all open intervals, all closed intervals, all countable unions of open intervals, all countable unions of closed intervals, all countable intersections of open intervals, and all countable intersections of closed intervals.
If the sample space S = R, then we will take the event space Σ = B_R.
(3) Let I be any interval, and let B_I denote the analogous collection for I: all open intervals contained in I, all closed intervals contained in I, and all countable unions and countable intersections of such intervals.
If the sample space S = I, then we will take the event space Σ = B_I.

For any two events E and F , the event E ∪ F consists of all outcomes that are either
in E or in F, i.e., the event E ∪ F will occur if either E or F occurs. The event E ∩ F
consists of all outcomes which are both in E and F , i.e., the event E ∩ F will occur if
both E and F occur.
Mutually exclusive events: Two events E1 and E2 are said to be mutually exclusive
if they cannot occur simultaneously, i.e., if E1 ∩ E2 = ∅.
Similarly, we can define union and intersection of more than two events.

For any event E, the event E^c (the complement of E) consists of all outcomes in the sample space S that are not in E, i.e., E^c will occur if E does not occur. For example, in the two-dice experiment, let E = {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}, i.e., E is the event that the sum of the dice equals seven; then E^c will occur if the sum of the dice is not equal to seven.
Definition 7. Consider a random experiment with sample space S. For each event E,
we assume that a real number P (E) is assigned which satisfies the following three axioms:

(1) P (E) ≥ 0 for all events E (together with axioms (2) and (3), this forces 0 ≤ P (E) ≤ 1);


(2) P (S) = 1;
(3) If E1, E2, . . . is a countably infinite collection of mutually exclusive events, that is, Ei ∩ Ej = ∅ for i ≠ j, then P (∪_{i=1}^∞ Ei) = Σ_{i=1}^∞ P (Ei).

The real number P (E) is known as the probability of the event E.


Remark 8. It is clear that P is a function from the event space Σ to [0, 1] which satisfies
the axioms (1), (2) and (3). We will call P a probability function and the triple
(S, Σ, P ) is called the probability space.
Example 9. (1) Let S = {1, 2, 3, . . .}. Define P on P(S) as follows:
P ({i}) = 1/2^i, i = 1, 2, . . . .
Then P defines a probability (verify this!).
(2) Let S = (0, ∞). Define P on B_{(0,∞)} as follows: for each interval I ⊆ S,
P (I) = ∫_I e^{−x} dx.
Then P defines a probability (verify this!).
(3) Let S = [0, 1]. Define P on B_{[0,1]} as follows: for each interval I ⊆ S,
P (I) = length of I.
Then P defines a probability (verify!).
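Example 9(1) can be sanity-checked numerically: the singleton masses 1/2^i sum to 1, and the probability of an event is the sum of the masses of its elements. A sketch using a finite truncation of S (the function name `P` is ours):

```python
# Example 9(1): S = {1, 2, 3, ...} with P({i}) = 1/2**i.
def P(event):
    """P(E) = sum of the singleton masses 1/2**i over i in E (E finite)."""
    return sum(1 / 2**i for i in event)

# Axiom (2), approximately: the mass of a long truncation of S is ~1,
# since sum_{i=1}^{n} 1/2**i = 1 - 1/2**n.
assert abs(P(range(1, 60)) - 1.0) < 1e-12

# Finite additivity on disjoint events:
assert abs(P({1, 2}) - (P({1}) + P({2}))) < 1e-15   # 3/4 = 1/2 + 1/4
```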

Assigning Probabilities:

(1) Suppose S is a finite set containing n elements. Then it is sufficient to assign a probability to each event containing a single element. Thus, for any event E, we have P (E) = Σ_{w∈E} P ({w}). One such assignment is the equally likely assignment, or the assignment of uniform probabilities. Under this assignment, P ({w}) = 1/n for every w ∈ S, and P (E) = (number of elements in E)/n.
(2) If S is a countable set, one cannot make an equally likely assignment of probabilities. It still suffices to make the assignment for each event containing a single element; then for any event E, define P (E) = Σ_{w∈E} P ({w}).
(3) If S is an uncountable set, then again one cannot make an equally likely assignment of probabilities.
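The equally likely assignment can be illustrated on the two-dice sample space from Definition 4, using the event "the sum is seven" mentioned above (variable names are ours):

```python
from fractions import Fraction
from itertools import product

# Uniform probabilities on S = {(i, j) : i, j in {1, ..., 6}}, so n = 36.
S = list(product(range(1, 7), repeat=2))
n = len(S)

def P(event):
    """P(E) = (number of elements in E) / n under the uniform assignment."""
    return Fraction(len(event), n)

E = {w for w in S if w[0] + w[1] == 7}   # the sum of the two dice is seven
assert P(E) == Fraction(1, 6)            # 6 favourable outcomes out of 36
assert P(set(S) - E) == 1 - P(E)         # complement rule
```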
Theorem 10. Let (S, Σ, P ) be a probability space. Then

(1) P (∅) = 0;
(2) for mutually exclusive events E1, E2, . . . , En, we have P (∪_{i=1}^n Ei) = Σ_{i=1}^n P (Ei);
(3) P (E^c) = 1 − P (E);
(4) for E1 ⊆ E2, we have P (E1) ≤ P (E2) and P (E2 − E1) = P (E2) − P (E1);
(5) P (E1 ∪ E2) = P (E1) + P (E2) − P (E1 ∩ E2).

Proof. (1) Let E1 = S and Ei = ∅ for i = 2, 3, . . .. Then P (E1) = 1, E1 = ∪_{i=1}^∞ Ei and Ei ∩ Ej = ∅ for i ≠ j. Therefore
1 = P (E1) = P (∪_{i=1}^∞ Ei) = Σ_{i=1}^∞ P (Ei) = 1 + Σ_{i=2}^∞ P (∅),
which gives Σ_{i=2}^∞ P (∅) = 0. This shows that the constant series Σ_{i=2}^∞ P (∅) converges to 0. Hence P (∅) = 0, since otherwise the constant series Σ_{i=2}^∞ P (∅) could not be convergent.

(2) Let Ei = ∅ for i = n + 1, n + 2, . . .. Then Ei ∩ Ej = ∅ for i ≠ j and P (Ei) = 0 for i = n + 1, n + 2, . . .. Therefore, P (∪_{i=1}^n Ei) = P (∪_{i=1}^∞ Ei) = Σ_{i=1}^∞ P (Ei) = Σ_{i=1}^n P (Ei) (since P (Ei) = 0 for i = n + 1, n + 2, . . .).

(3) Since E ∪ E^c = S and E ∩ E^c = ∅, we have 1 = P (S) = P (E ∪ E^c) = P (E) + P (E^c) (by using (2)). Hence P (E^c) = 1 − P (E).

(4) Since E2 = E1 ∪ (E2 − E1) and E1 ∩ (E2 − E1) = ∅, P (E2) = P (E1 ∪ (E2 − E1)) = P (E1) + P (E2 − E1). This implies that P (E2 − E1) = P (E2) − P (E1); moreover, since P (E2 − E1) ≥ 0, it follows that P (E1) ≤ P (E2).

(5) Since E1 ∪ E2 = E1 ∪ (E2 − E1) and E1 ∩ (E2 − E1) = ∅,
P (E1 ∪ E2) = P (E1 ∪ (E2 − E1)) = P (E1) + P (E2 − E1). . . . (i)
Also, since E2 = (E1 ∩ E2) ∪ (E2 − E1) and (E1 ∩ E2) ∩ (E2 − E1) = ∅,
P (E2) = P (E1 ∩ E2) + P (E2 − E1), so P (E2 − E1) = P (E2) − P (E1 ∩ E2). . . . (ii)
Thus, by equations (i) and (ii), we have
P (E1 ∪ E2) = P (E1) + P (E2) − P (E1 ∩ E2). □
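The identities of Theorem 10 can be sanity-checked on a small uniform space, e.g. a single fair die; a sketch (the events E1, E2 below are our arbitrary choices):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}                   # a fair die, uniform probabilities

def P(E):
    return Fraction(len(E), len(S))

E1, E2 = {1, 2, 3}, {2, 3, 4, 5}

assert P(set()) == 0                                  # (1) P(empty set) = 0
assert P(S - E1) == 1 - P(E1)                         # (3) complement rule
assert P({1, 2}) <= P({1, 2, 3})                      # (4) monotonicity
assert P(E1 | E2) == P(E1) + P(E2) - P(E1 & E2)       # (5)
```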

Inclusion-exclusion identity: For events E1 , E2 and E3 we have


P (E1 ∪ E2 ∪ E3 ) = P ((E1 ∪ E2 ) ∪ E3 ) = P (E1 ∪ E2 ) + P (E3 ) − P ((E1 ∪ E2 ) ∩ E3 )
= P (E1 ) + P (E2 ) + P (E3 ) − P (E1 ∩ E2 ) − P ((E1 ∩ E3 ) ∪ (E2 ∩ E3 ))
= P (E1 ) + P (E2 ) + P (E3 ) − P (E1 ∩ E2 ) − P (E1 ∩ E3 ) − P (E2 ∩ E3 )
+ P (E1 ∩ E2 ∩ E3 )
Inductively, for any n events E1, E2, . . . , En, we have
P (E1 ∪ E2 ∪ · · · ∪ En) = Σ_{i=1}^n P (Ei) − Σ_{i<j} P (Ei ∩ Ej) + Σ_{i<j<k} P (Ei ∩ Ej ∩ Ek) − · · · + (−1)^{n+1} P (E1 ∩ E2 ∩ · · · ∩ En).

This identity is known as the inclusion-exclusion identity.
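The identity for n = 3 can be verified numerically; a sketch (the 12-outcome uniform space and the three events below are our choices for illustration):

```python
from fractions import Fraction
from itertools import combinations

S = set(range(1, 13))                    # a 12-outcome uniform space

def P(E):
    return Fraction(len(E), len(S))

events = [{1, 2, 3, 4, 5, 6}, {4, 5, 6, 7, 8}, {2, 4, 6, 8, 10, 12}]

def intersection(sets):
    out = set(S)
    for A in sets:
        out &= A
    return out

# Right-hand side: singles minus pairs plus the triple intersection.
rhs = sum(
    (-1) ** (k + 1) * sum(P(intersection(c)) for c in combinations(events, k))
    for k in range(1, 4)
)
assert P(events[0] | events[1] | events[2]) == rhs   # both equal 10/12 = 5/6
```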


Exhaustive events: A countable collection {Ei | i ∈ Λ} of events is said to be exhaustive if P (∪_{i∈Λ} Ei) = 1, where Λ is an index set.
Definition 11. Let (S, Σ, P ) be a probability space and (En ) be a sequence of events in
Σ.

(1) We say that the sequence (En) is increasing (written as En ↑) if En ⊆ En+1, n = 1, 2, . . .;
(2) We say that the sequence (En) is decreasing (written as En ↓) if En+1 ⊆ En, n = 1, 2, . . .;
(3) We say that the sequence (En ) is monotone if either En ↑ or En ↓.
(4) If En ↑, we define Lim_{n→∞} En = ∪_{n=1}^∞ En;
(5) If En ↓, we define Lim_{n→∞} En = ∩_{n=1}^∞ En.

Theorem 12 (Continuity of Probability). Let (En) be a monotone sequence of events. Then
P (Lim_{n→∞} En) = lim_{n→∞} P (En),
where lim_{n→∞} P (En) denotes the limit of the real sequence (P (En)).

Proof. Exercise. □
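Continuity of probability can be illustrated with the space of Example 9(1): the events En = {1, . . . , n} increase to S, and P (En) = 1 − 1/2^n → 1 = P (S). A numerical sketch:

```python
# The space of Example 9(1): S = {1, 2, ...} with P({i}) = 1/2**i.
def P(event):
    return sum(1 / 2**i for i in event)

# Increasing events E_n = {1, ..., n}, whose limit is S itself.
probs = [P(range(1, n + 1)) for n in range(1, 51)]

assert all(a <= b for a, b in zip(probs, probs[1:]))  # P(E_n) is non-decreasing
assert abs(probs[-1] - 1.0) < 1e-12                   # P(E_n) -> P(S) = 1
```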
