
CHAPTER 1

PROBABILITY AND RANDOM VARIABLE

1 INTRODUCTION.
The basic idea of probability deals with the notion of randomness. Some real-life experiments have uncertain outcomes, and their behaviour can be determined only by performing them a large number of times under identical conditions; such experiments are known as statistical experiments. For example, drawing a card from a pack of cards, or tossing an unbiased coin, or rolling an unbiased die; all of these experiments have a finite number of outcomes. We can also construct experiments having countably or uncountably many possible outcomes, for example choosing an integer from the set of real numbers, or choosing an irrational number from the real line.

Let Ω be the set of all possible outcomes of a statistical experiment, i.e., each possible outcome of a statistical experiment can be represented by a point ω ∈ Ω. The set Ω is called the sample space.
Mathematically, a sample space Ω is an arbitrary, nonempty set, usually postulated as a reference space for a given statistical experiment. Its elements are referred to as points (of the space) and are generally denoted by ω.
Let A be a subset of Ω; then A is called an event associated with the statistical experiment. For any reference space Ω, the complement Ac of a subset A of Ω, defined by Ac = Ω \ A, is also an event.
Let A and B be two events; then A ∩ B and A ∪ B are subsets of Ω, and hence A ∩ B and A ∪ B are also events. It is clear that union, intersection and complementation of events correspond to ‘or’, ‘and’ and ‘negation’, respectively.

To define the probability of an event, let us start with the simplest statistical experiment, the coin tossing experiment. Although we know all possible outcomes in advance, we cannot predict the outcome of the next trial with certainty. In such situations we are interested in the probability of occurrence of a particular outcome.

In the case of the coin tossing experiment, Ω = {H, T}. The probability of occurrence of a head in a given trial is denoted by P[H] and defined as

P[H] = (number of favourable outcomes) / (total number of possible outcomes) = 1/2.

Similarly, P[T] = 1/2.

As another example, in a die rolling experiment we have Ω = {1, 2, 3, 4, 5, 6}. Let A = {1, 3, 5} be an event. Then

P[A] = |A| / |Ω| = 3/6 = 1/2.

It is also clear that P[Ω] = 1.
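As a small computational illustration (not part of the original notes), the following Python sketch computes the classical probability P[A] = |A| / |Ω| for the die example above; the names omega and A are purely illustrative.

    from fractions import Fraction

    def classical_probability(event, sample_space):
        # P[A] = (number of favourable outcomes) / (total number of possible outcomes),
        # assuming all outcomes are equiprobable.
        return Fraction(len(event & sample_space), len(sample_space))

    omega = {1, 2, 3, 4, 5, 6}                    # die rolling experiment
    A = {1, 3, 5}                                 # the event "an odd face shows"
    print(classical_probability(A, omega))        # 1/2
    print(classical_probability(omega, omega))    # 1, i.e. P[Omega] = 1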

In both of the above examples we deal with experiments having finitely many possible outcomes, and hence we are able to assign a probability to each and every outcome of the experiment. Moreover, we were considering fair coins and fair dice, i.e., each outcome has an equal chance of occurrence (the outcomes are equiprobable).

At this point a few questions arise:

• What happens if, in some experiment, the outcomes are not equiprobable?

• What happens if, in some experiment, there is an uncountable number of outcomes?

Will assigning probabilities to each outcome remain this easy in these cases as well?
Consider the first case, in which the outcomes are not equiprobable. Again take the coin tossing experiment, but this time we do not know whether the coin is fair or unfair. To calculate the probability of occurrence of a head, we toss the coin a sufficiently large number of times (say n). Then the probability of occurrence of a head is denoted by P[H] and defined as

P[H] = (frequency of occurrence of H) / (total number of tosses).

Similarly,

P[T] = (frequency of occurrence of T) / (total number of tosses).

It is now clear that we can assign a probability to each outcome of a statistical experiment with finitely many outcomes by conducting the same experiment a large number of times under identical conditions.
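The frequency definition can be illustrated by a short simulation. The Python sketch below (an assumed illustration, not part of the original notes) tosses a possibly unfair coin n times and estimates P[H] by the relative frequency of heads; the bias p_head is a hypothetical true value used only to generate the tosses.

    import random

    def estimate_p_head(n, p_head=0.3, seed=0):
        # Estimate P[H] = (frequency of occurrence of H) / (total number of tosses).
        rng = random.Random(seed)
        heads = sum(1 for _ in range(n) if rng.random() < p_head)
        return heads / n

    for n in (10, 100, 10_000, 1_000_000):
        print(n, estimate_p_head(n))   # the estimate settles near the true value 0.3 as n grows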
Answering the second question is not as easy as in the first case. Let us consider the following example.
Let the sample space Ω be the unit square [0, 1]², where it is natural to require that all points (outcomes) be equiprobable. Let ω be a point of Ω, i.e., ω ∈ Ω, and let A be a subset of Ω. We could then try to calculate the probability of the event A from the probabilities of the ω’s in A, i.e.,

P[A] = P[∪_{ω∈A} {ω}] = Σ_{ω∈A} P[ω].

Taking A = Ω, we get

P[Ω] = Σ_{ω∈Ω} P[ω].        (1)

We know that [0, 1]² has uncountably many outcomes (ω’s), so the probability of each individual outcome must be equal to zero, i.e.,

P[ω] = 0, ∀ ω ∈ Ω.        (2)

From equation (1), equation (2) and the fact that P[Ω] = 1, we get

1 = P[Ω] = Σ_{ω∈Ω} P[ω] = Σ_{ω∈Ω} 0 = 0.

This contradiction shows that setting P[ω] = 0, ∀ ω ∈ Ω, and summing over individual outcomes is not a correct way of assigning probabilities, although it is intuitively clear that the probability of A ⊆ Ω must equal the proportion of outcomes in A. To see this, discretise the sample space Ω = [0, 1]² into n² squares of side 1/n. Then the probability that a given outcome belongs to a particular square is 1/n². This implies that

P[Ω] = Σ_{i=1}^{n²} 1/n² = 1.

Also, as n → ∞, the probability that a given outcome belongs to a particular square converges to zero, while the squares of side 1/n shrink to points of Ω.
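A quick numerical check of this discretisation argument (illustrative only):

    # Each of the n^2 squares of side 1/n carries probability 1/n^2.
    for n in (2, 10, 100, 1000):
        per_square = 1 / n**2        # probability of landing in one particular square
        total = n**2 * per_square    # summed over all n^2 squares
        print(n, per_square, total)  # per_square -> 0 while the total stays equal to 1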

The above discussion suggests that instead of assigning a probability to each individual outcome, we should assign probabilities to subsets of Ω.
Let A be a class of subsets of a nonempty set Ω. If A ⊆ Ω is an event, then A ∈ A. Let ω be an outcome of the experiment. If we can answer the question ‘Does ω ∈ A?’, then certainly we can also answer ‘Does ω ∈ Ac?’. Again, let Ai ⊆ Ω, i = 1, 2, 3, . . . . If we are able to answer whether ω ∈ Ai or not for every i = 1, 2, 3, . . . , then we can determine whether ω ∈ ∪_{i=1}^∞ Ai or not.
The above reasoning shows that the class (collection of subsets of Ω) to which we want to assign probabilities must be closed under complements and countable unions, and hence it is closed under countable intersections also. This gives rise to the notions of an algebra and a σ-algebra.

Definition 1.1. A class A of subsets of a nonempty set Ω is called an algebra (or field) on Ω, if

1. Ω ∈ A.

2. A ∈ A ⇒ Ac ∈ A.

3. A, B ∈ A ⇒ A ∪ B ∈ A.

Note. The collection {φ, Ω} is an algebra, as it satisfies all the conditions for being an algebra.
Let A ⊂ Ω; then {φ, Ω, A, Ac} is an algebra, known as the algebra generated by the subset A.

Example. The family of all subsets of a finite set Ω is an algebra.

Example. Let Ω = {1, 2, 3, 4}, and define

1. A1 = {φ, Ω, {1}, {2, 3, 4}},

2. A2 = {Ω, {1, 2}, {3, 4}},

3. A3 = {φ, Ω, {1}, {1, 2}, {3, 4}, {2, 3, 4}}.

Check whether Ai, i = 1, 2, 3, are algebras or not.

Solution.

1. A1 = {φ, Ω, {1}, {2, 3, 4}} is an algebra, as A1 satisfies all the conditions for being an algebra.

2. A2 = {Ω, {1, 2}, {3, 4}} is not an algebra, since φ ∉ A2.

3. A3 = {φ, Ω, {1}, {1, 2}, {3, 4}, {2, 3, 4}} is not an algebra, since {1} ∪ {3, 4} = {1, 3, 4} ∉ A3.
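For a finite sample space such as this one, the conditions of Definition 1.1 can be checked mechanically. The Python sketch below (illustrative; the helper name is_algebra is not from the original text) tests conditions 1–3 and reproduces the conclusions for A1, A2 and A3.

    from itertools import combinations

    def is_algebra(collection, omega):
        # Check Definition 1.1 for a collection of subsets of a finite set omega.
        omega = frozenset(omega)
        A = {frozenset(s) for s in collection}
        if omega not in A:                                       # condition 1: Omega in A
            return False
        if any(omega - s not in A for s in A):                   # condition 2: closed under complements
            return False
        return all(s | t in A for s, t in combinations(A, 2))    # condition 3: closed under unions

    omega = {1, 2, 3, 4}
    A1 = [set(), omega, {1}, {2, 3, 4}]
    A2 = [omega, {1, 2}, {3, 4}]
    A3 = [set(), omega, {1}, {1, 2}, {3, 4}, {2, 3, 4}]
    print(is_algebra(A1, omega), is_algebra(A2, omega), is_algebra(A3, omega))  # True False False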

Example. Let Ω = (0, 1), and define

1. A1 = {φ, Ω, (0, 1/4), (1/4, 1)},

2. A2 = {φ, Ω, (0, 1/2), [1/2, 1)}.

Check whether Ai, i = 1, 2, are algebras or not.

Solution.

1. A1 = {φ, Ω, (0, 1/4), (1/4, 1)} is not an algebra, as (0, 1/4)c = [1/4, 1) ∉ A1.

2. A2 = {φ, Ω, (0, 1/2), [1/2, 1)} is an algebra, as A2 satisfies all the conditions for being an algebra.

Definition 1.2. A class A of subsets of a nonempty set Ω is called a σ-algebra (or σ-field) on Ω, if

1. Ω ∈ A.

2. A ∈ A ⇒ Ac ∈ A.

3. A1, A2, A3, . . . ∈ A ⇒ ∪_{n=1}^∞ An ∈ A.

The elements of an algebra or σ-algebra are called events, and the singletons in an algebra or σ-algebra are called simple events.
• If A is a finite algebra, then A is a σ-algebra.

• If {Aj}_{j∈J} is any collection of σ-algebras defined on the same set, then their intersection ∩_{j∈J} Aj is also a σ-algebra.

• The union of two σ-algebras may or may not be a σ-algebra.
Example. Let Ω = {1, 2, 3}, and let

1. A1 = {φ, Ω, {1}, {2, 3}},

2. A2 = {φ, Ω, {2}, {1, 3}},

3. A3 = {φ, Ω, {1}, {2}, {3}, {1, 2}, {2, 3}, {1, 3}}

be three algebras on Ω. Show that the union of two algebras is not necessarily an algebra.

Solution. We have A1 ∪ A2 = {φ, Ω, {1}, {2}, {1, 3}, {2, 3}}, which is not an algebra, since {1} ∪ {2} = {1, 2} ∉ A1 ∪ A2.
But A1 ∪ A3 = {φ, Ω, {1}, {2}, {3}, {1, 2}, {2, 3}, {1, 3}} is an algebra.
Definition 1.3. The minimal non-empty events belonging to A are called atoms; that is, a non-empty A ∈ A is an atom if the only events of A contained in A are φ and A itself.
Example. Let Ω = {1, 2, 3, 4} and let F be the algebra

F = {φ, {1}, {2}, {1, 2}, {3, 4}, {2, 3, 4}, {1, 3, 4}, Ω}.

Find the atoms of F.

Solution. The atoms of F are {1}, {2} and {3, 4}. The set {3, 4} is an atom because no non-empty proper subset of {3, 4} belongs to F.
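For a finite algebra the atoms can also be found mechanically: they are the minimal non-empty members of F. A possible Python sketch (illustrative names, applied to the example above):

    def atoms(collection):
        # Atoms of a finite algebra: non-empty members containing no smaller non-empty member.
        F = {frozenset(s) for s in collection}
        nonempty = [s for s in F if s]
        return [s for s in nonempty if not any(t < s for t in nonempty)]

    omega = {1, 2, 3, 4}
    F = [set(), {1}, {2}, {1, 2}, {3, 4}, {2, 3, 4}, {1, 3, 4}, omega]
    print(sorted(atoms(F), key=sorted))   # the atoms {1}, {2} and {3, 4}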

Example. Let Ω = (0, 1) and let F be the algebra

F = {φ, Ω, (0, 2/3), [2/3, 1)}.

Find the atoms of F.

Solution. The atoms of F are (0, 2/3) and [2/3, 1).

Lemma 1.1. Different atoms of a σ-algebra A must be disjoint.

Proof. Let A and B be atoms of a σ-algebra A. Then A ∩ B ∈ A. Since A ∩ B ⊆ A and A is an atom, either A ∩ B = φ or A ∩ B = A. Similarly, either A ∩ B = φ or A ∩ B = B. Hence either A = B or A ∩ B = φ.

Definition 1.4. A collection M of subsets of Ω is a monotone class if An ∈ M, n = 1, 2, . . . , together with An ↑ A or An ↓ A, implies that A ∈ M.

In other words, a collection M of subsets of Ω is a monotone class if, whenever we have an increasing sequence of sets A1 ⊆ A2 ⊆ A3 ⊆ . . . with Ai ∈ M, the limit ∪_{i=1}^∞ Ai of this sequence is in M, and, whenever we have a decreasing sequence of sets A1 ⊇ A2 ⊇ A3 ⊇ . . . with Ai ∈ M, the limit ∩_{i=1}^∞ Ai of this sequence is in M.

Monotone classes are closely related to σ-algebras, and can be used to verify whether a given collection is a σ-algebra or not.

Lemma 1.2. A necessary and sufficient condition for an algebra A to be a σ-algebra is that it is a monotone class.

Proof. Let A be a σ-algebra and let An ∈ A, n = 1, 2, . . . . If An ↑ A, then A = ∪_{n=1}^∞ An is a countable union of sets of A, so A ∈ A. Similarly, if An ↓ A, then A = ∩_{n=1}^∞ An is a countable intersection of sets of A, so A ∈ A. Hence A is a monotone class.
Conversely, let A be an algebra that is also a monotone class, and let An ∈ A, n = 1, 2, . . . . Since A is an algebra, Bn = ∪_{i=1}^n Ai ∈ A and Bn ⊆ Bn+1. Consequently, by the definition of a monotone class, Bn ↑ ∪_{i=1}^∞ Ai ∈ A. Similarly one can show that ∩_{i=1}^∞ Ai ∈ A.

2 PROBABILITY MEASURE.
Definition 2.1. If A is a σ-algebra relative to Ω, then the pair (Ω, A) is
called a measurable space. The elements of A are called measurable
sets.

Definition 2.2. A non-negative, σ-additive, real valued function µ on a class A containing φ, with µ(φ) = 0, is called a measure.

Definition 2.3. If µ is a measure on a σ-algebra A of subsets of Ω, then the triplet (Ω, A, µ) is called a measure space.
A measure space (Ω, F, P) is a probability space if P(Ω) = 1; the measure P is then called a probability measure, or simply a probability.

Definition 2.4. Let A be an algebra of subsets of Ω. A non-negative real valued set function µ = µ(A), A ∈ A, is called a finitely additive measure defined on A if, for all disjoint A, B ∈ A,

µ(A ∪ B) = µ(A) + µ(B).

Definition 2.5. Let A be an algebra of subsets of Ω. A finitely additive measure µ = µ(A), A ∈ A, defined on the algebra A is a σ-additive measure (countably additive, or completely additive), or simply a measure, if for all pairwise disjoint sets A1, A2, . . . ∈ A with ∪_{n=1}^∞ An ∈ A,

µ(∪_{n=1}^∞ An) = Σ_{n=1}^∞ µ(An).

A σ-additive measure µ is said to be σ-finite if Ω can be represented in the form

Ω = ∪_{n=1}^∞ Ωn,  Ωn ∈ A,

where µ(Ωn) < ∞, n = 1, 2, . . . .

3 RANDOM VARIABLE.
Definition 3.1. The elements of the σ-algebra B generated by the class of finite intervals (x, y), −∞ < x < y < ∞, are known as the Borel sets (of the line), or linear Borel sets, or Borel sets in R. The measurable space (R, B) is called the Borel line or 1-dimensional Borel space.

Note. Let B be a σ-algebra containing all finite intervals (x, y), −∞ < x < y < ∞. Then it must contain intervals of the following forms:

• (x, y], since we can rewrite (x, y] = ∩_{n=1}^∞ (x, y + 1/n) and a σ-algebra is closed under countable intersections.

• [x, y), since we can rewrite [x, y) = ∩_{n=1}^∞ (x − 1/n, y) and a σ-algebra is closed under countable intersections.

• [x, y], since we can rewrite [x, y] = ∩_{n=1}^∞ (x − 1/n, y + 1/n) and a σ-algebra is closed under countable intersections.

• (x, ∞), since we can rewrite (x, ∞) = ∪_{n=1}^∞ (x, x + n) and a σ-algebra is closed under countable unions.

• [x, ∞), since we can rewrite [x, ∞) = ∩_{n=1}^∞ (x − 1/n, ∞) and a σ-algebra is closed under countable intersections.

• (−∞, y), since we can rewrite (−∞, y) = ∪_{n=1}^∞ (y − n, y) and a σ-algebra is closed under countable unions.

• (−∞, y], since we can rewrite (−∞, y] = ∩_{n=1}^∞ (−∞, y + 1/n) and a σ-algebra is closed under countable intersections.

• {x}, since we can rewrite {x} = ∩_{n=1}^∞ (x − 1/n, x + 1/n) and a σ-algebra is closed under countable intersections.

This implies that intervals of all types, as well as singletons, belong to B.

Definition 3.2. A real valued function X = X(ω) defined on (Ω, F) is F-measurable, or Borel measurable, or a random variable, if

{ω : X(ω) ∈ B} ∈ F, for every B ∈ B;

or, equivalently, if the inverse image

X⁻¹(B) ≡ {ω : X(ω) ∈ B} is a measurable set in Ω, for every B ∈ B.

A necessary and sufficient condition for a function X = X(ω) to be F-measurable is that {ω : X(ω) < x} ∈ F for every x ∈ R.

The simplest example of a random variable is the indicator function IA(ω) of an arbitrary (measurable) set A ∈ F.

Example. For a coin tossing experiment, the sample space of possible outcomes is Ω = {H, T} (for heads and tails). Let F = {φ, Ω, {H}, {T}} be an algebra for this experiment, and let X(ω) be defined as

X(ω) = 2 if ω = H;  X(ω) = 10 if ω = T.

Show that X(ω) is a random variable defined on (Ω, F).

Solution. For X(ω) to be a random variable it must satisfy the following condition:

{ω : X(ω) ∈ B} ∈ F, ∀ B ∈ B.

We have

{ω : X(ω) = 2} = {H} ∈ F,
{ω : X(ω) = 10} = {T} ∈ F,

and for any Borel set B the preimage {ω : X(ω) ∈ B} is one of φ, {H}, {T} or Ω, according to which of the values 2 and 10 belong to B. Hence X(ω) is a random variable defined on (Ω, F).
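On a finite sample space this check can be carried out exhaustively: it suffices to verify that the preimage of every set of values of X lies in F. A minimal Python sketch (illustrative, with assumed helper names), applied to the coin example:

    from itertools import chain, combinations

    def is_random_variable(X, omega, F):
        # X is given as a dict omega -> R.  Check that {w : X(w) in B} lies in F
        # for every subset B of the (finite) range of X.
        F = {frozenset(s) for s in F}
        values = set(X.values())
        value_sets = chain.from_iterable(combinations(values, r) for r in range(len(values) + 1))
        for B in value_sets:
            preimage = frozenset(w for w in omega if X[w] in B)
            if preimage not in F:
                return False
        return True

    omega = {"H", "T"}
    F = [set(), omega, {"H"}, {"T"}]
    X = {"H": 2, "T": 10}
    print(is_random_variable(X, omega, F))   # True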

Example. Let Ω = {1, 2, 3} be a sample space and let F = {φ, Ω, {1}, {2, 3}} be an algebra on Ω. Show that X(ω) = a (a constant) is a random variable defined on (Ω, F).

Solution. For X(ω) to be a random variable it must satisfy the following condition:

{ω : X(ω) ∈ B} ∈ F, ∀ B ∈ B.

We have {ω : X(ω) = a} = Ω ∈ F, and {ω : X(ω) ∈ B} = φ ∈ F whenever a ∉ B. Hence X(ω) is a random variable defined on (Ω, F).

Example. Let X(ω) be a constant function. Prove that X(ω) is always a random variable with respect to every algebra or σ-algebra.

Solution. Given X(ω) = a, where a is some constant, we have, for any Borel set B,

{ω : X(ω) ∈ B} = Ω if a ∈ B, and {ω : X(ω) ∈ B} = φ if a ∉ B.

We know that Ω and φ necessarily belong to every F. Hence X(ω) is always a random variable, irrespective of the algebra or σ-algebra.

Example. Let Ω = {1, 2, 3, 4} and F = {φ, Ω, {1}, {2, 3, 4}}. Is X(ω) := ω + 1 a random variable with respect to the σ-algebra F? If not, give an example of a non-constant function which is a random variable.

Solution. Since, for example,

{ω : X(ω) = 3} = {2} ∉ F,

X(ω) = ω + 1 is not a random variable.

For a real valued function Y(ω) to be a random variable with respect to F, it must be constant over every atom of the algebra or σ-algebra. In this example we have two atoms, {1} and {2, 3, 4}. If we define a real valued function X(ω) such that X(1) = a, where a ∈ R, and X(ω) = b, where b ∈ R, for ω ∈ {2, 3, 4}, then X(ω) is a random variable with respect to (Ω, F).

Example. Let Ω = {−3, −2, −1, 0, 1, 2, 3}. Find the smallest σ-algebra with respect to which each of the following functions is a random variable.

1. X(ω) = ω².

2. X(ω) = ω + 1.

Solution. We know that a random variable has a constant value over every atom of the algebra or σ-algebra.

1. The atoms for X(ω) = ω² are the level sets {0}, {−1, 1}, {−2, 2} and {−3, 3}. The σ-algebra generated by these atoms is the smallest σ-algebra on which the random variable X(ω) = ω² can be defined.

2. The atoms for X(ω) = ω + 1 are the singletons {−3}, {−2}, {−1}, {0}, {1}, {2} and {3}. The σ-algebra generated by these atoms (the power set of Ω) is the smallest σ-algebra on which the random variable X(ω) = ω + 1 can be defined.
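On a finite sample space the atoms of this smallest σ-algebra are exactly the level sets {ω : X(ω) = c}, and the σ-algebra itself consists of all unions of these atoms. A short Python sketch of the construction (illustrative, not part of the original notes):

    from itertools import combinations

    def level_set_atoms(X, omega):
        # Atoms of the smallest sigma-algebra making X measurable: the level sets of X.
        return [frozenset(w for w in omega if X(w) == c) for c in {X(w) for w in omega}]

    def generated_sigma_algebra(atoms):
        # All (finite) unions of atoms; the empty union gives the empty set.
        sigma = set()
        for r in range(len(atoms) + 1):
            for combo in combinations(atoms, r):
                sigma.add(frozenset().union(*combo))
        return sigma

    omega = {-3, -2, -1, 0, 1, 2, 3}
    atoms_sq = level_set_atoms(lambda w: w * w, omega)
    print(sorted(map(sorted, atoms_sq)))             # [[-3, 3], [-2, 2], [-1, 1], [0]]
    print(len(generated_sigma_algebra(atoms_sq)))    # 16 = 2**4 events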
Example. Let Ω = [0, 1], and let F be the σ-algebra of Borel sets of [0, 1]. Check whether X(ω) = ω is a random variable on (Ω, F).

Solution. Since X(ω) = ω, ∀ ω ∈ Ω, we have, for any x ∈ R,

X⁻¹((−∞, x]) = {ω : ω ≤ x} = φ if x < 0;  [0, x] if 0 ≤ x < 1;  Ω if x ≥ 1.

Each of these sets is a Borel subset of [0, 1], i.e., belongs to F. This implies that X(ω) = ω is a random variable on (Ω, F).

4 FUNCTION OF RANDOM VARIABLE.


Example. Let X be a random variable on (Ω, F). Show that Y(ω) = aX(ω) + b is a random variable on (Ω, F), where a, b are constants.

Solution. We have Y(ω) = aX(ω) + b, where X is a random variable on (Ω, F). Assume a > 0 (the case a < 0 is analogous, and for a = 0 the function Y is constant and hence a random variable). Then, for any interval (i, j),

Y⁻¹((i, j)) = {ω : Y(ω) ∈ (i, j)}
= {ω : aX(ω) + b ∈ (i, j)}
= {ω : X(ω) ∈ ((i − b)/a, (j − b)/a)} ∈ F,

since X(ω) is a random variable on (Ω, F). Hence Y is a random variable on (Ω, F).

Example. Let X be a random variable on (Ω, F). Then X⁺, defined as

X⁺(ω) = X(ω) if X(ω) > 0;  X⁺(ω) = 0 if X(ω) ≤ 0,

is also a random variable on (Ω, F).

Solution. We can write

X⁺(ω) = X(ω) I_{X(ω)>0}.

Then, for any a ≥ 0,

(X⁺)⁻¹((−∞, a]) = {ω : X⁺(ω) ≤ a} = {ω : X(ω) ≤ a} ∈ F,

since every ω with X(ω) ≤ 0 satisfies X⁺(ω) = 0 ≤ a; and for a < 0,

{ω : X⁺(ω) ≤ a} = φ ∈ F.

Since X(ω) is a random variable on (Ω, F) and the sets (−∞, a] generate B, X⁺(ω) is a random variable on (Ω, F).

Example. Let X1, X2, . . . , Xn be random variables on (Ω, F), and let Y = max{X1, X2, . . . , Xn}. Show that Y is a random variable on (Ω, F).

Solution. Since Xi, i = 1, 2, . . . , n, are random variables on (Ω, F), we have

Xi⁻¹((−∞, x]) ≡ {ω : Xi(ω) ≤ x} ∈ F for every x ∈ R.

Now

Y⁻¹((−∞, y]) ≡ {ω : Y(ω) ≤ y}
= {ω : max{X1(ω), X2(ω), . . . , Xn(ω)} ≤ y}
= {ω : X1(ω) ≤ y, X2(ω) ≤ y, . . . , Xn(ω) ≤ y}
= ∩_{i=1}^n {ω : Xi(ω) ≤ y} ∈ F,

since Xi, i = 1, 2, . . . , n, are random variables on (Ω, F) and F is closed under countable (hence finite) intersections. As the sets (−∞, y] generate B, Y is a random variable on (Ω, F).

Example. Let X be a random variable on (Ω, F). Then any Borel measurable function f : R → R of X, i.e., Y(ω) = f(X(ω)), is also a random variable on (Ω, F).

Solution. For any B ∈ B,

Y⁻¹(B) ≡ {ω : f(X(ω)) ∈ B}
= {ω : X(ω) ∈ f⁻¹(B)}
= {ω : X(ω) ∈ B1} ∈ F, where B1 = f⁻¹(B) ∈ B since f is Borel measurable.

Hence Y(ω) = f(X(ω)) is a random variable on (Ω, F).

Exercises.

1. Let F be an algebra. Show that φ ∈ F.

2. Let F be an algebra and let A and B belong to F. Show that A \ B is also an event, i.e., A \ B ∈ F.

3. Let F be an algebra and let A and B belong to F. Show that the symmetric difference A∆B is also an event, i.e., A∆B ∈ F.

4. Show that the power set of a set Ω is always an algebra. Now let Ω = [0, 1] and let Fi, i = 1, 2, 3, be collections of subsets of Ω defined as follows:

   a. F1 = {φ, Ω, {1}, Ω \ {1}},
   b. F2 = {φ, Ω, {0} ∪ {1}, (0, 1)},
   c. F3 = {φ, Ω, {1}, {0}, (0, 1)}.

   Check whether Fi, i = 1, 2, 3, are algebras or not.


5. Let F be a σ-algebra on Ω = [0, 1] such that [1/(n + 1), 1/n] ∈ F, n = 1, 2, . . . . Show that

   a. {0} ∈ F.
   b. (1/n, 1] ∈ F, n = 1, 2, . . . .

6. Let Ω = I be the set of integers, and let F be the collection of subsets of Ω given by F = {A ⊆ Ω : A is a finite set}. Check whether F is an algebra or not. If yes, then check whether it is a σ-algebra.

7. Let X be a random variable defined on (Ω, F). Show that Y(ω) = |X(ω)| is a random variable on (Ω, F).

8. Let X and Y be two random variables defined on (Ω, F). Show that Z(ω) = X(ω) + Y(ω) is a random variable on (Ω, F).

9. Let X1, X2, . . . , Xn be random variables on (Ω, F), and let Y = min{X1, X2, . . . , Xn}. Show that Y is a random variable on (Ω, F).

10. Let X and Y be two random variables defined on (Ω, F). For A ∈ F, define

    Z(ω) := X(ω) if ω ∈ A;  Z(ω) := Y(ω) if ω ∈ Ac.

    Show that Z(ω) is also a random variable defined on (Ω, F).

