This chapter discusses the concepts of sequences, bounds, and limit theorems in the context of random variables and their convergence. It introduces important inequalities such as Markov's and Chebychev's, and explores convergence in probability, almost everywhere, and in mean square sense. Various examples illustrate the behaviour of sequences of random variables and their probability density functions.

CHAPTER 9
Sequences, Bounds and Limit Theorems

1. Introduction

In this chapter we shall first study sequences of random variables and their convergence. We shall study three important inequalities, viz., Markov's inequality, Chebychev's inequality and Chernoff's inequality. Then we shall study the most important theorem in Statistics, viz., the central limit theorem, and its applications.

2. Convergence in Probability

So far we have studied the behaviour of a random variable individually. In the next chapters, we shall have to deal with a set of random variables collectively. To discuss their behaviour we need a new concept, "convergence in probability" or stochastic convergence. To understand convergence in probability, consider first the following example.
function f() = 1 for 0s zs
Example: A continuous random variable Z has the probability density function f(z) = 1 for 0 ≤ z ≤ 1. Prove that it satisfies the conditions of a probability density function and find the probabilities (i) P(0.1 < Z < 0.3), (ii) P(0.2 < Z < 0.7). Interpret your result.

Solution: Since f(z) = 1, f(z) ≥ 0 for all z in [0, 1].

Further, ∫₀¹ f(z) dz = ∫₀¹ 1 dz = [z]₀¹ = 1.

Hence, f(z) = 1 satisfies the conditions of a probability density function.

Now, P(0.1 < Z < 0.3) = ∫ from 0.1 to 0.3 of dz = 0.3 − 0.1 = 0.2,

and P(0.2 < Z < 0.7) = ∫ from 0.2 to 0.7 of dz = 0.7 − 0.2 = 0.5. (See Fig. 9.1.)

Consider the line segment OA = [0, 1]. In (i), z lies in the subinterval [0.1, 0.3] and the length of the subinterval is 0.2, so the probability that a point z lies in the subinterval [0.1, 0.3] is equal to the length of the subinterval. Similarly, we see from (ii) that the probability that the point lies in the subinterval [0.2, 0.7] is equal to the length of the subinterval, 0.5.

Thus, if the probability that a point lies in a subinterval of [0, 1] is equal to the length of the subinterval, then the probability density function is given by f(z) = 1, 0 ≤ z ≤ 1.
(a) Sequence of Random Variables

Let Z be a random variable having the probability density function given in the example above, f(z) = 1, 0 ≤ z ≤ 1. Then we can define a number of random variables in terms of Z. For example, define a new random variable Vn in terms of z by the relation

Vn(z) = (1 − 1/n) z,  n = 1, 2, 3, 4, ...

For n = 1, V1 = 0; for n = 2, V2 = z/2; for n = 3, V3 = 2z/3; for n = 4, V4 = 3z/4; and so on.

Thus, we get a sequence of random variables V1, V2, V3, ...

The probability density functions of these random variables can be obtained as studied in Chapter 7 (see Ex. 2, page 7-9).

For example, consider the transformation V2 = z/2. Then f(v2) = f(z) |dz/dv2|.

Now, by data f(z) = 1 and dz/dv2 = 2, so f(v2) = 1 × 2 = 2.

Further, when Z = 0, V2 = 0, and when Z = 1, V2 = 1/2.

Hence the probability density function of V2 is f(v2) = 2, 0 ≤ v2 ≤ 1/2.

Similarly, we can obtain the probability density functions of the other random variables. For n = 100, V100 = (99/100) z and f(v100) = 100/99, 0 ≤ v100 ≤ 99/100.

As n → ∞, we get f(v) = 1 in 0 < v < 1, i.e., the probability density function of V(z) = z is a uniform probability density function.

(i) Sequence of random variables as a sequence of functions: Consider V1(z), V2(z), ... as functions of z. As z takes different values in [0, 1], V1, V2, ... take different values in the specified intervals and are themselves functions of z. These functions are shown in Fig. 9.2. Thus, the sequence of random variables Vn(z) can be considered as a sequence of functions.

(ii) Sequence of random variables as a sequence of real numbers: Alternatively, we may select a number z in [0, 1] at random, say 0.8. Then

for n = 1, V1 = 0;
for n = 2, V2 = 0.8/2 = 0.4;
for n = 3, V3 = (2/3)(0.8) ≈ 0.53;
for n = 4, V4 = (3/4)(0.8) = 0.6;
and so on.

Random Signal Analysis (9-3) Sequences, Bounds and L. T.

Thus, for a given z in [0, 1], say z = z1, we get a sequence of real numbers determined by that particular value of z; for different values of z, we get different sequences. The sequence of real numbers corresponding to a particular z1 is shown in Fig. 9.3.

Fig. 9.3: Sequence of real numbers determined by z1.


(iii) Convergence of real numbers: From Fig. 9.3, we see that as n becomes larger and larger, i.e., as n → ∞, the sequence V1(z1), V2(z1), ... approaches z1, and in the limit becomes z1. Thus, we say that the sequence of real numbers Vn(z1) converges to z1. If we take another value z2, we get another sequence, which as n → ∞ will approach z2.

(iv) Convergence of functions: Consider fn(x) = sin (ω + 1/n)x.

For n = 1, f1(x) = sin (ω + 1)x;
for n = 2, f2(x) = sin (ω + 1/2)x;
for n = 3, f3(x) = sin (ω + 1/3)x; and so on.

It is easy to see that as n → ∞, for any fixed value of x, say x1, fn(x1) → sin ωx1, since 1/n → 0. If x changes to x2 then fn(x2) → sin ωx2, and this is true for any value of x (any point). This is called pointwise convergence.

Definition: The sequence fn(x) converges (pointwise) to the function f(x) if for each x0 the sequence of numbers fn(x0) converges to f(x0).

(v) Convergence of random variables: If we now consider the sequence of functions V1(z), V2(z) = z/2, ..., V100(z) = (99/100)z, ..., where z takes values in [0, 1], then Vn(z) will approach z as n → ∞. (See Fig. 9.2.)

Thus, we see that the sequence of random variables V1(z), V2(z), ..., Vn(z), ... will approach the random variable Z as n → ∞.
From this example, we now define the convergence of a sequence of random variables.

Definition: A sequence of random variables X1, X2, ..., Xn, ... is said to converge in the probabilistic sense to a constant c if, for any ε > 0,

lim (n → ∞) P{|Xn − c| < ε} = 1.

This is equivalent to

lim (n → ∞) P{|Xn − c| > ε} = 0.

This is also called stochastic convergence. It is also denoted in simple form as Xn → c as n → ∞ (Xn converges to c in the probabilistic sense as n → ∞).

(b) Convergence Everywhere

Definition: Consider a sequence of random variables {Xn(z)}. If the sequence of functions Xn(z) converges to the function X(z) for all z in S as n → ∞, i.e., if Xn(z) → X(z) for all z in S as n → ∞, then we say that the sequence {Xn(z)} converges everywhere to the random variable X(z).

Example: Let Z be a random variable in S = [0, 1] defined as in the example given on page 9-1, i.e., the probability that Z lies in an interval is equal to the length of the interval. Consider the following sequences of random variables:

(i) Un(z) = z/n, n = 1, 2, 3, ...
(ii) Vn(z) = (1 − 1/n) z, n = 1, 2, 3, ...

Discuss the convergence of Un and Vn.

Solution: (i) Since as n → ∞, z/n → 0 whatever may be z, the sequence Un(z) → 0 everywhere in S = [0, 1]. In this case all sample sequences, i.e., the sequences for different values of z, converge to the same value, zero.

(ii) Since as n → ∞, (1 − 1/n) z → z for all z in S = [0, 1], the sequence Vn(z) converges to Z everywhere in S = [0, 1]. In this case different sample sequences, i.e., the sequences for different values of z, say z1, z2, z3, ..., converge to different values z1, z2, z3, ...
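As an illustrative sketch (not from the text), the two sample sequences above can be computed numerically for a fixed point z of S; every sample sequence of Un tends to 0, while each sample sequence of Vn tends to the chosen z:

```python
# Sample sequences U_n(z) = z/n and V_n(z) = (1 - 1/n) z for a fixed z in [0, 1].
# U_n converges to 0 for every z; V_n converges to the chosen z itself.

def U(n, z):
    return z / n

def V(n, z):
    return (1 - 1 / n) * z

z = 0.8  # any point of S = [0, 1]
u_seq = [U(n, z) for n in range(1, 1001)]
v_seq = [V(n, z) for n in range(1, 1001)]

# Tails are already very close to the limits 0 and z respectively.
err_u = abs(u_seq[-1] - 0.0)   # = 0.8/1000
err_v = abs(v_seq[-1] - z)     # = 0.8/1000
print(err_u, err_v)
```

Changing z = 0.8 to another point of [0, 1] produces a different sample sequence of Vn with a different limit, exactly as described in (ii).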
(c) Convergence Almost Everywhere (a.e.)

Definition: Consider a sequence of random variables {Xn(z)}. If the sequence of functions Xn(z) converges to the function X(z) as n → ∞, not necessarily for all z, but such that P[Xn(z) → X(z)] = 1, then we say that the sequence {Xn(z)} converges almost everywhere to X(z).

Example: Consider the following experiment. An urn contains 2 red balls and 2 yellow balls. At each instant a ball is drawn at random from the urn and its colour is noted. If the number of balls of this colour is greater than the number of balls of the other colour, then the ball is put back in the urn; otherwise the ball is kept out. Let Xn(z) denote the number of red balls in the urn after the n-th draw. Discuss the convergence of the sequence of random variables.

Solution: Suppose in the first draw a red ball is selected and kept out. Now at any other draw, if a yellow ball is drawn, then since the number of yellow balls (= 2) is greater than the number of red balls, the yellow ball will be put back in the urn. This will continue for every draw in which a yellow ball is drawn; but eventually, with probability one, the remaining red ball will be drawn and kept out. Thus, Xn(z) → 0 as n → ∞ with probability one.

On the other hand, if at the first draw a yellow ball is drawn and kept out, then by the same reasoning we shall eventually draw the other yellow ball, which will also be kept out, and Xn(z) → 2 with probability one.

Thus, Xn(z) → X(z) as n → ∞, where X(z) is equal to 0 or 2, each with probability 1/2. The exceptional sample sequences (those in which the minority colour is never drawn again) have probability zero. In other words, Xn(z) → X(z) almost everywhere.
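A Monte Carlo sketch of the urn scheme, under the assumed reading of the rules (a drawn ball is returned if its colour is currently in the majority, otherwise kept out), shows the absorption at 0 or 2 red balls with roughly equal frequency:

```python
import random

# Simulate the urn: 2 red + 2 yellow; a drawn ball is returned if its colour
# is in the majority, otherwise it is kept out. X_n = red balls remaining.

def run_urn(n_draws, rng):
    red, yellow = 2, 2
    for _ in range(n_draws):
        drew_red = rng.random() < red / (red + yellow)
        if drew_red and red <= yellow:            # red not in majority: keep out
            red -= 1
        elif (not drew_red) and yellow <= red:    # yellow not in majority: keep out
            yellow -= 1
        # otherwise the ball is put back, counts unchanged
    return red

rng = random.Random(0)
limits = [run_urn(500, rng) for _ in range(2000)]
frac_zero = limits.count(0) / len(limits)
frac_two = limits.count(2) / len(limits)
print(frac_zero, frac_two)   # each close to 1/2
```

After 500 draws essentially every run has been absorbed, so the simulated limit takes only the values 0 and 2, in agreement with the argument above.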

(d) Mean Square Convergence

Consider a sequence of random variables {Xn(z)}. We say the sequence {Xn(z)} converges to X(z) in the mean square sense if E[{Xn(z) − X(z)}²] → 0 as n → ∞.

We denote mean square convergence by the limit in mean:

l.i.m. Xn(z) = X(z) as n → ∞.

Example: Let z be a number selected at random from the interval S = [0, 1]. Let the probability that z lies in a sub-interval be equal to the length of the sub-interval. Define a new sequence of random variables by

Vn(z) = (1 − 1/n) z, n = 1, 2, 3, ...

Show that the sequence {Vn(z)} converges in mean square sense.

Solution: We see that V1(z) = 0, V2(z) = z/2, ..., and Vn(z) approaches z as n → ∞.

We have seen that the probability density of z is

f(z) = 1, 0 < z < 1; f(z) = 0, elsewhere,

and the probability density function of Vn(z) in the limit is also a uniform density function in [0, 1].

Now, consider E[{Vn(z) − z}²]. We have Vn(z) − z = −z/n.

Since z has the uniform probability density function in the interval [0, 1],

E[{Vn(z) − z}²] = E[z²/n²] = (1/n²) ∫₀¹ z² dz = 1/(3n²).

As n → ∞ this tends to zero, i.e., the mean square error tends to zero. Hence, {Vn(z)} converges in mean square sense.
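A quick Monte Carlo check (an illustrative sketch, not from the text) of the computation E[{Vn(z) − z}²] = 1/(3n²):

```python
import random

# Estimate E[(V_n(z) - z)^2] for V_n(z) = (1 - 1/n) z, z ~ Uniform(0, 1),
# and compare with the theoretical value 1/(3 n^2).

rng = random.Random(1)
N = 100_000

def mse(n):
    total = 0.0
    for _ in range(N):
        z = rng.random()
        total += ((1 - 1 / n) * z - z) ** 2   # = (z/n)^2
    return total / N

results = {n: mse(n) for n in (1, 2, 10)}
for n in (1, 2, 10):
    print(n, results[n], 1 / (3 * n * n))
```

For each n the sample average lands very close to 1/(3n²), and the error shrinks like 1/n², which is the mean square convergence just proved.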
(e) Convergence in Probability

Definition: A sequence of random variables {Xn(z)} is said to converge in probability to the random variable X(z) if, for any ε > 0,

P[|Xn(z) − X(z)| > ε] → 0 as n → ∞.

Fig. 9.4 illustrates the convergence in probability where the limiting random variable X(z) is a constant z0.
Example: Prove that if a sequence of random variables {Xn(z)} converges in mean square sense then the sequence converges in probability.

Solution: Let the sequence of random variables {Xn(z)} converge to the random variable X(z) in the mean square sense, i.e., E[|Xn − X|²] → 0 as n → ∞.

Now, by Markov's inequality [see page 9-13],

P[|Xn − X| > ε] = P[|Xn − X|² > ε²] ≤ E[|Xn − X|²] / ε².

If the sequence {Xn(z)} converges in mean square sense, then the right hand side tends to zero.

Hence P[|Xn − X| > ε] → 0 as n → ∞, i.e., the sequence {Xn(z)} converges in probability.
(f) Convergence in Distribution

Definition: A sequence of random variables {Xn(z)} is said to converge in distribution if the sequence of distribution functions {Fn(z)} converges to a distribution function F(z) for all z at which F(z) is continuous, i.e., if Fn(z) → F(z) as n → ∞.

The central limit theorem is an example of convergence in distribution. (See the central limit theorem below, page 9-17.) If X̄ = (X1 + X2 + ... + Xn)/n, where E(Xi) = μ and Var(Xi) = σ², i = 1, 2, ..., n, then by the central limit theorem the distribution of X̄ tends to the normal distribution with mean μ and standard deviation σ/√n.

3. Bounds of Probabilities (Chebychev's Inequality, Markov's Inequality, Chernoff's Inequality)

If we know the probability distribution of a random variable X, we can calculate E(X) and Var(X) if these exist. But from the knowledge of these measures alone we cannot find the probability distribution or calculate the probability that X = a, where a is a given constant.

Although we cannot find such probabilities, we can find the bounds within which these probabilities lie using Chebychev's inequality. In other words, Chebychev's inequality gives us bounds on the probability that a random variable can deviate from its mean value. The most striking aspect of the inequality is that it is universal, in the sense that it does not depend upon the nature of the probability distribution of X.
(a) Chebychev's Inequality

If X is a random variable with mean μ and standard deviation σ, then for any positive number k,

P(|X − μ| ≥ kσ) ≤ 1/k².

Proof: Let the probability density function of X be f(x). (See Fig. 9.5.)

Case (i): Let X be a continuous random variable. By definition,

σ² = E[(X − μ)²] = ∫ over (−∞, ∞) of (x − μ)² f(x) dx
   = ∫ over (−∞, μ−kσ] + ∫ over (μ−kσ, μ+kσ) + ∫ over [μ+kσ, ∞) of (x − μ)² f(x) dx
   ≥ ∫ over (−∞, μ−kσ] of (x − μ)² f(x) dx + ∫ over [μ+kσ, ∞) of (x − μ)² f(x) dx,

dropping the middle non-negative term. For the first integral x ≤ μ − kσ, i.e., μ − x ≥ kσ, and for the second integral x ≥ μ + kσ, i.e., x − μ ≥ kσ; in both cases (x − μ)² ≥ k²σ².

Hence σ² ≥ k²σ² [∫ over (−∞, μ−kσ] of f(x) dx + ∫ over [μ+kσ, ∞) of f(x) dx].

But ∫ from a to b of f(x) dx = P(a ≤ X ≤ b).

Hence σ² ≥ k²σ² [P(X − μ ≤ −kσ) + P(X − μ ≥ kσ)] = k²σ² P(|X − μ| ≥ kσ),

so P(|X − μ| ≥ kσ) ≤ 1/k².

Case (ii): In the case of a discrete random variable, the proof follows on the same lines, on replacing integration by summation.

Corollary 1. Since P{|X − μ| ≥ kσ} + P{|X − μ| < kσ} = 1,

P{|X − μ| < kσ} = 1 − P{|X − μ| ≥ kσ} ≥ 1 − 1/k².

Corollary 2. If kσ = c (c > 0), then

(A) P{|X − μ| < c} ≥ 1 − σ²/c², or equivalently P{(μ − c) < X < (μ + c)} ≥ 1 − σ²/c², and
(B) P{|X − μ| ≥ c} ≤ σ²/c².
(b) Generalised Form of Chebychev's Inequality

If g(X) is a non-negative function of a random variable X, then for every k > 0, we get

P{g(X) ≥ k} ≤ E[g(X)] / k.   ... (C)

We shall accept this theorem without proof.
Example 1: X is a random variable denoting the number of complaints received at a service-station on a day, with mean 20 and standard deviation 2. Find the probability that on a day the number of complaints will lie between 8 and 32.

Solution: We have μ = 20 and σ = 2.

Now, |x − μ| = kσ gives k = |x − μ|/σ. When x = 32, k = (32 − 20)/2 = 6; when x = 8, |k| = |8 − 20|/2 = 6.

By Corollary 1, page 9-7, we get

P(|X − μ| < kσ) ≥ 1 − 1/k², i.e., P(|X − 20| < 12) ≥ 1 − 1/36 = 35/36.

Hence P(8 < X < 32) ≥ 35/36.
Example 2: If X is the number shown up when an unbiased die is thrown, show that P(|X − 3.5| ≥ 2.5) < 0.47. Also find the actual probability.

Solution: (i) If X denotes the number obtained in a single throw of an unbiased die, its p.d.f. is given as follows:

X = x:       1    2    3    4    5    6
P(X = x):  1/6  1/6  1/6  1/6  1/6  1/6

E(X) = (1 + 2 + 3 + 4 + 5 + 6)/6 = 21/6 = 3.5   [∵ 1 + 2 + ... + n = n(n + 1)/2]

E(X²) = (1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6   [∵ 1² + 2² + ... + n² = n(n + 1)(2n + 1)/6]

σ² = V(X) = E(X²) − [E(X)]² = 91/6 − 49/4 = 35/12.

By corollary 2 (B), page 9-7, of Chebychev's inequality, P(|X − μ| ≥ c) ≤ σ²/c².

Comparing this with P(|X − 3.5| ≥ 2.5), we see that μ = 3.5, c = 2.5 and σ² = 35/12.

Hence P(|X − 3.5| ≥ 2.5) ≤ (35/12)/(2.5)² = 7/15 ≈ 0.47.

(ii) |X − 3.5| ≥ 2.5 means X ≤ 3.5 − 2.5 = 1 or X ≥ 3.5 + 2.5 = 6, i.e., X = 1 or X = 6. The actual probability of this event is 1/6 + 1/6 = 1/3 ≈ 0.33, which is indeed less than the bound 0.47.
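The computation in Example 2 above can be verified exactly with rational arithmetic (an illustrative sketch, not from the text):

```python
from fractions import Fraction as F

# Example 2 check: X uniform on {1,...,6}; Chebychev bound vs exact tail.
xs = range(1, 7)
p = F(1, 6)
mu = sum(p * x for x in xs)                      # 7/2
var = sum(p * x * x for x in xs) - mu ** 2       # 35/12

c = F(5, 2)
bound = var / c ** 2                             # Chebychev: sigma^2 / c^2
exact = sum(p for x in xs if abs(x - mu) >= c)   # only X = 1 and X = 6 qualify

print(mu, var, bound, exact)
```

The exact probability 1/3 is well inside the universal Chebychev bound 7/15, illustrating that the bound is valid but usually not tight.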

Example 3: Two unbiased dice are thrown. If X is the sum of the numbers shown up, prove that P(|X − 7| ≥ 3) ≤ 35/54. Also find the actual probability.

Solution: (i) If X denotes the sum of points obtained when a pair of unbiased dice is thrown, its p.d.f. is as given in the Example on page 2-6.

E(X) = Σ pi xi = (2·1 + 3·2 + 4·3 + 5·4 + 6·5 + 7·6 + 8·5 + 9·4 + 10·3 + 11·2 + 12·1)/36 = 252/36 = 7.

E(X²) = Σ pi xi² = (4·1 + 9·2 + 16·3 + 25·4 + 36·5 + 49·6 + 64·5 + 81·4 + 100·3 + 121·2 + 144·1)/36
      = (4 + 18 + 48 + 100 + 180 + 294 + 320 + 324 + 300 + 242 + 144)/36 = 1974/36 = 329/6.

σ² = V(X) = E(X²) − [E(X)]² = 329/6 − 49 = 35/6.

By corollary 2 (B) of Chebychev's inequality, P(|X − μ| ≥ c) ≤ σ²/c².

Comparing this with P(|X − 7| ≥ 3), we see that μ = 7, c = 3 and σ² = 35/6.

Hence P(|X − 7| ≥ 3) ≤ (35/6)/9 = 35/54.

(ii) Now, |X − 7| ≥ 3 means X ≤ 4 or X ≥ 10.

P(|X − 7| ≥ 3) = P(X ≤ 4) + P(X ≥ 10)
= P(X = 2) + P(X = 3) + P(X = 4) + P(X = 10) + P(X = 11) + P(X = 12)
= (1 + 2 + 3 + 3 + 2 + 1)/36 = 12/36 = 1/3.
Note: To find (i) S = 1 + 2a + 3a² + 4a³ + ..., (ii) S = 1 + 4a + 9a² + 16a³ + ...

(i) Multiply S by a:

S  = 1 + 2a + 3a² + 4a³ + ...
aS =      a + 2a² + 3a³ + ...

By subtraction, (1 − a)S = 1 + a + a² + a³ + ... (G.P.) = 1/(1 − a). Hence S = (1 − a)⁻².

(ii) Multiply S by −3a, 3a² and −a³:

S     = 1 + 4a + 9a² + 16a³ + 25a⁴ + ...
−3aS  =   − 3a − 12a² − 27a³ − 48a⁴ − ...
3a²S  =          3a² + 12a³ + 27a⁴ + ...
−a³S  =                 − a³ −  4a⁴ − ...

By addition, (1 − 3a + 3a² − a³)S = 1 + a, i.e., (1 − a)³S = 1 + a. Hence S = (1 + a)(1 − a)⁻³.

These results are often required and as such should be committed to memory.
Example 4: A discrete random variate X has probability 1/7 of assuming each of the values 1, 2, ..., 7. Find P(|X − 4| ≥ 3) and compare it with the bound given by Chebychev's inequality.

Solution: We have P(X = x) = 1/7 for x = 1, 2, 3, ..., 7.

E(X) = (1 + 2 + ... + 7)/7 = 28/7 = 4.

E(X²) = (1² + 2² + ... + 7²)/7 = 140/7 = 20.

V(X) = E(X²) − [E(X)]² = 20 − 16 = 4.

Now |X − 4| ≥ 3 means X ≤ 4 − 3 or X ≥ 4 + 3, i.e., X = 1 or X = 7.

Hence P(X ≤ 1 or X ≥ 7) = P(X = 1) + P(X = 7) = 1/7 + 1/7 = 2/7.

Now by Chebychev's inequality, P(|X − 4| ≥ c) ≤ σ²/c². Comparing this with P(|X − 4| ≥ 3), since μ = 4 we have c = 3, and since σ² = 4,

P(|X − 4| ≥ 3) ≤ 4/9.
Example 5: If X is a random variate such that E(X) = 3 and E(X²) = 13, use Chebychev's inequality to determine the lower bound for P(−2 < X < 8).

Solution: We have μ = E(X) = 3 and σ² = V(X) = E(X²) − [E(X)]² = 13 − 9 = 4.

By Chebychev's inequality, P(|X − μ| < c) ≥ 1 − σ²/c².

Comparing P(|X − 3| < c), i.e., (3 − c) < X < (3 + c), with P(−2 < X < 8), we see that c = 5.

Hence P(−2 < X < 8) ≥ 1 − 4/25 = 21/25.
Example 6: Use Chebychev's inequality to find the lower limit of P(−2 < X < 22) if the random variable X has mean μ = 10 and variance σ² = 9.

Solution: By corollary 2 (A) of Chebychev's inequality,

P((μ − c) < X < (μ + c)) ≥ 1 − σ²/c².

Comparing (μ − c) < X < (μ + c) with −2 < X < 22, since μ = 10, we see that c = 12.

Hence P(−2 < X < 22) ≥ 1 − 9/144 = 15/16.


Example 7: A fair die is thrown 600 times. Find the lower bound for the probability of getting between 80 and 120 sixes.

Solution: P(getting a six) = p = 1/6, q = 5/6. Here n = 600, and this is a binomial distribution.

μ = np = 600 × (1/6) = 100, σ² = npq = 600 × (1/6) × (5/6) = 250/3.

Comparing P(80 < X < 120) with P((μ − c) < X < (μ + c)) ≥ 1 − σ²/c², since μ = 100 we have c = 20.

We get P(80 < X < 120) ≥ 1 − (250/3)/400 = 1 − 5/24 = 19/24.
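How conservative is this Chebychev guarantee? As an illustrative sketch (not from the text), the exact binomial probability can be computed and compared with the bound 19/24:

```python
from math import comb
from fractions import Fraction as F

# Example 7 check: X ~ Binomial(600, 1/6); Chebychev lower bound vs exact value.
n = 600
p, q = F(1, 6), F(5, 6)
mu = n * p                       # 100
var = n * p * q                  # 250/3
cheb = 1 - var / F(20) ** 2      # 19/24

# Exact P(80 < X < 120) = sum over k = 81..119 (float arithmetic is adequate)
exact = sum(comb(n, k) * (1 / 6) ** k * (5 / 6) ** (n - k) for k in range(81, 120))
print(float(cheb), exact)
```

The exact probability (about 0.97) comfortably exceeds the guaranteed lower bound 19/24 ≈ 0.79, as expected: Chebychev's inequality holds for every distribution, so it cannot be tight for a well-behaved one.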

Exercise E

1. A random variable X has the probability density function given by
f(x) = e⁻ˣ, x ≥ 0; f(x) = 0, otherwise.
(i) Find P(|X − 1| > 2). (ii) Use Chebychev's inequality to obtain the upper bound of P(|X − 1| > 2).
[Ans.: (i) e⁻³, (ii) 1/4]

2. A random variable X has the uniform probability density function
f(x) = 1/(2√5), −√5 < x < √5; f(x) = 0, elsewhere.
Find P(|X − μ| ≥ (3/2)σ) and compare it with the upper bound obtained by using Chebychev's inequality.
[Ans.: 1 − √3/2; Chebychev bound 4/9]

3. A random variate has the probability density function
f(x) = 1/√3, 1 − √3/2 ≤ x ≤ 1 + √3/2; f(x) = 0, elsewhere.
Find P(|X − μ| ≥ (3/2)σ) and compare it with the upper bound obtained by Chebychev's inequality.
[Ans.: 1 − √3/2; Chebychev bound 4/9]

4. A continuous random variable is such that E(X) = 4 and V(X) = 4. Using Chebychev's inequality obtain the upper bound for (i) P(|X − 4| ≥ 2), (ii) P(|X − 4| ≥ 1).
[Ans.: (i) 1, (ii) 1]

5. A continuous random variable is such that E(X) = 7 and E(X²) = 53.
(i) What is the least value of P(3 ≤ X ≤ 11)?
(ii) What is the greatest value of P(|X − 7| > 4)?
(iii) What is the least value of P(|X − 7| < 5)?
(iv) What is the value of k that guarantees P(|X − 7| ≤ k) ≥ 0.96?
[Ans.: (i) 0.75, (ii) 0.25, (iii) 21/25, (iv) 10]

6. Prove that in 2000 throws with a fair coin the probability that the number of heads lies between 900 and 1100 is at least 19/20.

7. An unbiased coin is tossed 400 times. Obtain the lower bound to the probability that the number of heads lies between 180 and 220.
[Ans.: 3/4]

8. Let X be a Poisson variate with parameter 100. Use Chebychev's inequality to determine the lower bound of P(75 < X < 125).
[Ans.: 21/25]

9. Let X be a binomial variate with n = 500, p = 1/5. Using Chebychev's inequality find the lower bound to P(80 < X < 120).
[Ans.: 4/5]

10. How many trials must be made of an event with probability of success 0.5 in each trial in order that the probability will be at least 0.9 that the relative frequency of success will lie between 0.48 and 0.52?
[Ans.: 6250]
(c) Markov's Inequality

In particular, if we take g(X) = |X|, then from inequality (C), page 9-7, we get

P[|X| ≥ k] ≤ E[|X|] / k.

This is known as Markov's inequality. Thus, Markov's inequality gives an upper bound for the probability that a non-negative function of a random variable is greater than or equal to some positive constant.

Example 1: The mean height of students in the class of F.E. is 5 feet 5 inches. Find the bound on the probability that a student selected at random from the class is taller than 8 feet.

Solution: If H denotes the height of a student in the class, then we are given that E(H) = 65 inches. Also we are given that k = 96 inches. By Markov's inequality the upper bound of the required probability is given by

P[|X| ≥ k] ≤ E[|X|] / k.

Putting the values, P[H ≥ 96] ≤ 65/96 ≈ 0.68.
Example 2: Suppose the number of items produced in a factory in a week is a random variable with mean 300. What is the probability that this week's production will be more than 600?

Solution: By Markov's inequality,

P(X ≥ 600) ≤ E(X)/600 = 300/600 = 1/2.
Example 3: If the average income of a population is m, what is the probability that the income of a person selected at random is greater than 5 times the average income?

Solution: Here k = 5m. Then by Markov's inequality

P(X ≥ k) ≤ E(X)/k, i.e., P(X ≥ 5m) ≤ m/(5m) = 1/5.

Random Signal Analysis (9-13) Sequences, Bounds and L. T.
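Markov's inequality needs only the mean, so it applies to Example 2 whatever the distribution of weekly production. As an illustrative sketch (the exponential model here is an assumption, not from the text), one can compare the universal bound with an exact tail for one particular distribution with the same mean:

```python
import math

# Markov bound for Example 2: any non-negative X with E(X) = 300 satisfies
# P(X >= 600) <= 300/600 = 1/2. For comparison, compute the exact tail under
# an ASSUMED exponential model with the same mean.

mean = 300.0
a = 600.0
markov = mean / a                  # universal upper bound = 1/2

lam = 1 / mean                     # exponential rate giving E(X) = 300
exact_exp = math.exp(-lam * a)     # exponential tail P(X >= 600) = e^{-2}

print(markov, exact_exp)
```

The exponential tail e⁻² ≈ 0.135 is far below 1/2, again showing that a distribution-free bound trades tightness for universality.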

(d) Chernoff's Inequality

(i) Continuous Random Variable: Let X be a continuous random variable. Then

P(X ≥ a) ≤ e⁻ᵗᵃ Mx(t),

where Mx(t) is the moment generating function of X and a is an arbitrary constant. This inequality is known as Chernoff's inequality.

Proof: Let f(x) be the probability density function of X. Then we have

P(X ≥ a) = ∫ from a to ∞ of f(x) dx = ∫ over (−∞, ∞) of f(x) u(x − a) dx,   ... (1)

where u is the unit-step function.

Now, for any real value t > 0, it can be clear from some sketches that

e^{t(x−a)} ≥ u(x − a).   ... (2)

From (1) and (2), we get

P(X ≥ a) ≤ ∫ over (−∞, ∞) of f(x) e^{t(x−a)} dx.   ... (3)

But we know that [see (3), § 8, page 4-23]

∫ over (−∞, ∞) of f(x) e^{t(x−a)} dx = Ma(t),   ... (4)

i.e., the moment generating function of f(x) about a. But (4) can be written as

Ma(t) = ∫ e^{t(x−a)} f(x) dx = e⁻ᵗᵃ ∫ eᵗˣ f(x) dx = e⁻ᵗᵃ Mx(t)   ... (5)

[Mx(t) = the moment generating function about the origin].

Hence, from (3), (4) and (5), we get

P(X ≥ a) ≤ e⁻ᵗᵃ Mx(t).
(ii) Discrete Random Variable: Let X be a discrete random variable taking values 0, 1, 2, ... with probabilities P(X = i) = px(i).

Now, for any integers n, k we define

u(n − k) = 1 if n ≥ k; 0, otherwise.

Now, we have

P(X ≥ k) = Σ over n ≥ k of px(n) = Σ over n ≥ 0 of px(n) u(n − k).

But e^{t(n−k)} ≥ u(n − k) for t ≥ 0 (as seen earlier).

Hence P(X ≥ k) ≤ Σ px(n) e^{t(n−k)} = e⁻ᵗᵏ Σ px(n) eᵗⁿ = e⁻ᵗᵏ Mx(t).   [See (5), page 4-24]

Hence, we get P(X ≥ k) ≤ e⁻ᵗᵏ Mx(t).

Chernoff's Bounds

The inequality P(X ≥ a) ≤ e⁻ᵗᵃ Mx(t) gives an inequality in terms of the moment generating function Mx(t), which is a function of t. By minimising the right hand side, we can obtain Chernoff's bounds. This is done in the usual way, by differentiating the r.h.s. with respect to t and putting the derivative equal to zero.

Example 1: If X is a normal random variable with mean m and standard deviation σ, find Chernoff's bound for P(X ≥ a), where a > m.

Solution: We know that if X is a normal variate with mean m and standard deviation σ, then

Mx(t) = e^{mt + t²σ²/2}.   [See (d), page 6-13]

By Chernoff's inequality,

P(X ≥ a) ≤ e⁻ᵗᵃ Mx(t) = e⁻ᵗᵃ e^{mt + t²σ²/2} = e^{−t(a−m) + t²σ²/2}.   ... (1)

Now, let y = −t(a − m) + t²σ²/2. Then

dy/dt = −(a − m) + tσ² = 0, giving t = (a − m)/σ².

Putting this value of t in (1), the Chernoff bound is

P(X ≥ a) ≤ e^{−(a−m)²/σ² + (a−m)²/(2σ²)} = e^{−(a−m)²/(2σ²)}.
Example 2: If X is a Poisson random variate with parameter m (m > 0), find Chernoff's bound for P(X ≥ k), where k > m.

Solution: By Chernoff's inequality,

P(X ≥ k) ≤ e⁻ᵗᵏ Mx(t),

where Mx(t) is the moment generating function about the origin. But we know that the moment generating function of the Poisson distribution about the origin is

Mx(t) = e^{m(eᵗ − 1)}.   [See (A), page 5-20]

Hence P(X ≥ k) ≤ e⁻ᵗᵏ e^{m(eᵗ − 1)} = e^{m(eᵗ − 1) − kt}.   ... (1)

Now, let y = m(eᵗ − 1) − kt. Then

dy/dt = meᵗ − k = 0, giving eᵗ = k/m, i.e., t = log(k/m).

Putting this value of t in (1), we get

P(X ≥ k) ≤ e^{m(k/m − 1) − k log(k/m)} = e^{k − m − k log(k/m)}.   ... (2)

If we know k and m, we can calculate this bound. For example, if the Poisson distribution is given by

P(X = x) = e⁻² 2ˣ/x!,

and if k = 5, putting m = 2 and k = 5 in (2), we get

P(X ≥ 5) ≤ e^{5 − 2 − 5 log(5/2)} ≈ 0.2.
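Bound (2) can be evaluated numerically and compared with the exact Poisson tail (an illustrative sketch, not from the text):

```python
import math

# Chernoff bound (2) for a Poisson variate: P(X >= k) <= e^{k - m - k log(k/m)}.
def chernoff_poisson(k, m):
    return math.exp(k - m - k * math.log(k / m))

# Exact tail: P(X >= k) = 1 - sum_{i < k} e^{-m} m^i / i!
def poisson_tail(k, m):
    return 1 - sum(math.exp(-m) * m**i / math.factorial(i) for i in range(k))

bound = chernoff_poisson(5, 2)
exact = poisson_tail(5, 2)
print(bound, exact)
```

For m = 2, k = 5 the Chernoff bound is about 0.206 (the 0.2 quoted above), while the exact tail is only about 0.053; the bound is valid but, like Markov's and Chebychev's, not tight.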
Example 3: Compute Chernoff's bound for P(X ≥ a) where X is a random variate which follows the exponential law with parameter λ.

Solution: We know that if X is an exponential random variate with parameter λ, then

Mx(t) = λ/(λ − t), t < λ.   [See (c), page 6-6]

By Chernoff's inequality,

P(X ≥ a) ≤ e⁻ᵗᵃ Mx(t) = e⁻ᵗᵃ λ/(λ − t).   ... (1)

Now, let y = e⁻ᵗᵃ λ/(λ − t). Then

dy/dt = λ e⁻ᵗᵃ [1 − a(λ − t)] / (λ − t)²,

and dy/dt = 0 when a(λ − t) − 1 = 0, i.e., t = λ − 1/a (valid when a > 1/λ, so that t > 0).

Putting this value of t in (1), we get

P(X ≥ a) ≤ e^{−a(λ − 1/a)} · λa = aλ e^{1 − aλ}.

In particular, if λ = 2 and a = 2, then P(X ≥ 2) ≤ 4e⁻³ ≈ 0.199.
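A quick numerical check of this bound against the exact exponential tail (an illustrative sketch, not from the text):

```python
import math

# Chernoff bound for an exponential variate with rate lam (valid for a > 1/lam):
# P(X >= a) <= a*lam * e^{1 - a*lam}. The exact tail is e^{-a*lam}.
def chernoff_exp(a, lam):
    return a * lam * math.exp(1 - a * lam)

a, lam = 2.0, 2.0
bound = chernoff_exp(a, lam)     # 4 e^{-3} ~ 0.199
exact = math.exp(-a * lam)       # e^{-4}  ~ 0.018
print(bound, exact)
```

Note that the bound differs from the exact tail only by the factor e·aλ, so for the exponential law the Chernoff bound captures the correct exponential rate of decay.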
4. The Laws of Large Numbers

We shall now consider two important laws in probability theory known as the laws of large numbers. One law is known as the weak law of large numbers and the other is known as the strong law of large numbers.

(a) The Weak Law of Large Numbers

Let X1, X2, ... be a sequence of independent and identically distributed random variables with finite mean E(Xi) = μ. Let X̄ = (X1 + X2 + ... + Xn)/n. Then for any ε > 0,

lim (n → ∞) P(|X̄ − μ| < ε) = 1.

Alternatively, some authors state the above law as

lim (n → ∞) P(|X̄ − μ| > ε) = 0.   ... (A)

In simple words, it states that if we take large samples of size n, the sample mean will be close to the population mean with high probability. Fig. 9.6 shows how the sequence of sample means approaches the population mean as n, the size of the sample, approaches infinity.

The following theorem, known as the strong law of large numbers, is regarded by some statisticians as probably the best-known result in probability theory. In simple words, it states that the mean of a sequence of independent and identically distributed random variables converges to the mean of that distribution with probability 1.

(b) The Strong Law of Large Numbers

Let X1, X2, ... be a sequence of independent and identically distributed random variables with finite mean E(Xi) = μ. Then

P( lim (n → ∞) X̄ = μ ) = 1.

Alternatively, some authors state the above law as

P( lim (n → ∞) |X̄ − μ| > ε ) = 0.   ... (B)

Note the difference between (A) and (B). In the weak law of large numbers (A), we take the limit of the probabilities as n → ∞, while in the strong law of large numbers (B) we consider the probability of the limit.
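The weak law can be seen numerically. In this illustrative sketch (not from the text) we estimate P(|X̄ − μ| > ε) for iid Uniform(0, 1) variables, where μ = 1/2, at two sample sizes:

```python
import random

# Weak-law sketch: estimate P(|sample mean - 1/2| > eps) for Uniform(0,1)
# samples of size n; the probability shrinks as n grows.

rng = random.Random(42)

def prob_deviation(n, eps, runs=2000):
    bad = 0
    for _ in range(runs):
        xbar = sum(rng.random() for _ in range(n)) / n
        if abs(xbar - 0.5) > eps:
            bad += 1
    return bad / runs

p10 = prob_deviation(10, 0.1)      # moderate n: deviations still common
p1000 = prob_deviation(1000, 0.1)  # large n: deviations essentially never occur
print(p10, p1000)
```

With n = 10 the deviation probability is roughly 0.27, while with n = 1000 it is indistinguishable from zero, exactly the behaviour stated in (A).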

5. Central Limit Theorem (M.U. 2009, 10, 11)

The central limit theorem is a very important theorem in statistical analysis. We give below the central limit theorem in two forms, one known as Liapounoff's form and the other known as the Lindberg-Levy form.

Central Limit Theorem (Liapounoff's Form)

If X1, X2, ..., Xn are independent random variates with E(Xi) = μi and Var(Xi) = σi², i = 1, 2, ..., n, then under certain general conditions Sn = X1 + X2 + ... + Xn is asymptotically a normal variate with mean μ = Σμi and variance σ² = Σσi² as n tends to infinity (meaning n is large).

A particular form of the above theorem is of interest to us. The following form of the central limit theorem is known as the Lindberg-Levy theorem.

Central Limit Theorem (Lindberg-Levy Theorem)

If X1, X2, ... are independently and identically distributed random variates such that E(Xi) = μ and Var(Xi) = σ², i = 1, 2, ..., n, then Sn = X1 + X2 + ... + Xn is asymptotically a normal variate with mean nμ and variance nσ² as n tends to infinity.
Corollary: From the above theorem, we get a very important corollary as follows.

If X̄ is the mean of a sample of size n taken from a population having mean μ and variance σ², i.e., X̄ = (X1 + X2 + ... + Xn)/n, then E(X̄) = nμ/n = μ and Var(X̄) = nσ²/n² = σ²/n.

In other words, we get the following important result as a corollary of the central limit theorem:

If X̄ is the mean of a sample of size n drawn from a population with mean μ and standard deviation σ, then X̄ is normally distributed with mean μ and standard deviation σ/√n, and

Z = (X̄ − μ)/(σ/√n)

is a standard normal variate (S.N.V.) as n → ∞.

Significance of the Central Limit Theorem

The central limit theorem is regarded by some statisticians as the most important theorem in probability theory. It is important both theoretically and practically. Its significance (here lies its power) lies in the fact that it is applicable for any distribution of the Xi's. The central limit theorem deals with convergence in distribution. Since this convergence works best near the centre (near the mean), it is called the central limit theorem.

Example 1: The lifetime of a certain brand of electric bulb may be considered a random variable with mean 1200 hours and standard deviation 250 hours. Using the central limit theorem, find the probability that the average lifetime of 60 bulbs exceeds 1250 hours. (M.U.)

Solution: If X̄ denotes the average (mean) lifetime of 60 bulbs then, by the central limit theorem,

Z = (X̄ − μ)/(σ/√n) is a S.N.V., with μ = 1200, σ = 250 and n = 60.

If X̄ = 1250, then Z = (1250 − 1200)/(250/√60) = 1.55.

P(Z > 1.55) = area to the right of 1.55
= 0.5 − (area between z = 0 and z = 1.55)
= 0.5 − 0.4394 = 0.0606. (Fig. 9.7)
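Instead of a normal table, the tail area in Example 1 can be computed directly with the error function (an illustrative sketch, not from the text):

```python
import math

# Standard normal CDF via math.erf, used to check Example 1:
# Z = (1250 - 1200) / (250 / sqrt(60)) ~ 1.55 and P(Z > 1.55) ~ 0.0606.
def phi(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu, sigma, n = 1200.0, 250.0, 60
z = (1250 - mu) / (sigma / math.sqrt(n))
p = 1 - phi(z)
print(z, p)
```

The same two lines (compute Z, then 1 − phi(Z) or a difference of phi values) reproduce every table lookup in Examples 1 to 7.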

Example 2: The lifetime of a transistor is a random variate with mean 2500 hours and standard deviation 300 hours. Using the C.L.T., find the probability that the average lifetime of 50 transistors is more than 2550 hours.

Solution: If X̄ denotes the average lifetime of 50 transistors then, by the central limit theorem (Cor. 1), X̄ is normally distributed with mean μ = 2500 and standard deviation σ/√n = 300/√50.

Z = (X̄ − 2500)/(300/√50) is a S.N.V.

When X̄ = 2550, Z = (2550 − 2500)/(300/√50) = 50√50/300 ≈ 1.18.

P(X̄ > 2550) = P(Z > 1.18)
= area to the right of Z = 1.18
= 0.5 − (area from Z = 0 to Z = 1.18)
= 0.5 − 0.3810 = 0.1190. (Fig. 9.8)

Example 3: A random sample of size 100 is taken from a population whose mean is 60 and variance is 400. Using the central limit theorem (CLT), with what probability can we assert that the mean of the sample will not differ from μ = 60 by more than 4?

Solution: If X̄ denotes the mean of the sample of size 100, then by the central limit theorem (Cor. 1), X̄ is normally distributed with mean μ = 60 and standard deviation σ/√n = 20/√100 = 2.

Z = (X̄ − 60)/2 is a S.N.V.

The mean will not differ from 60 by more than 4 means the absolute difference |X̄ − 60| is less than 4.

P[|X̄ − 60| < 4] = P[−4 < X̄ − 60 < 4] = P[56 < X̄ < 64].

When X̄ = 56, Z = (56 − 60)/2 = −2; when X̄ = 64, Z = (64 − 60)/2 = 2.

P[|X̄ − 60| < 4] = P(−2 < Z < 2)
= (area from Z = −2 to Z = 0) + (area from Z = 0 to Z = 2)
= 2 × (area from Z = 0 to Z = 2)
= 2 × 0.4772 = 0.9544. (Fig. 9.9)
Example 4: Let X1, X2, ..., X100 be 100 independent and identically distributed (iid) random variables with mean 2 and variance 1/4. Find P(190 < X1 + X2 + ... + X100 < 210).

Solution: We have μ = E(Xi) = 2, σ = S.D. of Xi = 1/2, n = 100.

If Sn = X1 + X2 + ... + X100, then by the central limit theorem Sn is normally distributed with mean nμ and standard deviation σ√n.

Z = (Sn − nμ)/(σ√n) = (Sn − 100 × 2)/((1/2)√100) = (Sn − 200)/5 is a S.N.V.

When Sn = 190, Z = (190 − 200)/5 = −2; when Sn = 210, Z = (210 − 200)/5 = 2.

P(190 < Sn < 210) = P(−2 < Z < 2)
= (area from Z = −2 to Z = 0) + (area from Z = 0 to Z = 2)
= 2 × (area from Z = 0 to Z = 2)
= 2 × 0.4772 = 0.9544. (Fig. 9.10)

Note: Note carefully the difference between Examples 1 to 3 and Examples 4 to 7. In the first group X̄ is the variate, and in the second group Sn is the variate.
Example 5: The lifetime of a certain make of lamp is distributed exponentially with mean 30 hours. Find the probability that 144 of these lamps will give total light for more than 4500 hours.

Solution: We have μ = E(X) = 30 and σ = 30 (for an exponential distribution, standard deviation = mean = μ). Here n = 144.

If Sn = X1 + X2 + ... + X144, then by the central limit theorem Sn is normally distributed with mean nμ and standard deviation σ√n.

Z = (Sn − nμ)/(σ√n) = (Sn − 144 × 30)/(30√144) = (Sn − 4320)/360 is a S.N.V.

When Sn = 4500, Z = (4500 − 4320)/360 = 0.5.

P(Sn > 4500) = P(Z > 0.5)
= area to the right of Z = 0.5
= 0.5 − (area from Z = 0 to Z = 0.5)
= 0.5 − 0.1915 = 0.3085. (Fig. 9.11)

Example 6: The resistors r1, r2, r3, r4 are independent and identically distributed uniformly in the interval (430, 530). Using the central limit theorem, find P(1820 ≤ r1 + r2 + r3 + r4 ≤ 2020).

Solution: We know that the uniform distribution is given by

f(x) = 1/(b − a), a < x < b; 0, elsewhere

with mean = (a + b)/2 and variance = (b − a)²/12.

Since a = 430, b = 530,

Mean, μ = (430 + 530)/2 = 960/2 = 480

and Variance, σ² = (530 − 430)²/12 = 10,000/12 = 833.33.

∴ σ = √833.33 = 28.8675. Also n = 4, so σ√n = 28.8675 × 2 = 57.735.

If Sn = r1 + r2 + r3 + r4, then by the Central Limit Theorem, Sn is normally distributed with mean nμ and standard deviation σ√n.

∴ Z = (Sn − nμ)/(σ√n) = (Sn − 4 × 480)/57.735 = (Sn − 1920)/57.735 is a S.N.V.

When Sn = 1820, Z = (1820 − 1920)/57.735 = −1.73.
When Sn = 2020, Z = (2020 − 1920)/57.735 = 1.73.

[Fig. 9.12: area under the standard normal curve between Z = −1.73 and Z = 1.73]

∴ P(1820 ≤ Sn ≤ 2020) = P(−1.73 ≤ Z ≤ 1.73)
= (Area from Z = −1.73 to Z = 0) + (Area from Z = 0 to Z = 1.73)
= 2 × (Area from Z = 0 to Z = 1.73)
= 2 × 0.4582 = 0.9164.

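The uniform-sum computation can be verified the same way; this sketch (mine, not from the text) recomputes σ√n and the two-sided area:

```python
import math

def phi(z):
    # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

a, b, n = 430.0, 530.0, 4
mu = (a + b) / 2                     # 480
var = (b - a) ** 2 / 12              # 833.33
mean_S = n * mu                      # 1920
sd_S = math.sqrt(var * n)            # 57.735
p = phi((2020 - mean_S) / sd_S) - phi((1820 - mean_S) / sd_S)
print(round(p, 4))                   # 0.9167 (the two-decimal table value z = 1.73 gives 0.9164)
```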
Example 7: If X1, X2, ..., X50 are independent and identically distributed random variates, each with Poisson distribution with parameter m = 0.03, and if Sn = X1 + X2 + ... + X50, find P(Sn ≥ 3) using the central limit theorem. [Also compare your value with the exact value of the probability.]

Solution: We have μ = m = 0.03 and σ² = m = 0.03 (for a Poisson distribution, mean = variance = m). Then, n = 50.

If Sn = X1 + X2 + ... + X50, then Sn is normally distributed with mean nμ = 50 × 0.03 = 1.5 and standard deviation σ√n = √0.03 × √50 = √1.5.

∴ Z = (Sn − nμ)/(σ√n) = (Sn − 1.5)/√1.5 is a S.N.V.

When Sn = 3, Z = (3 − 1.5)/√1.5 = √1.5 = 1.22.

Now, P(Sn ≥ 3) = P(Z ≥ 1.22)
= Area to the right of Z = 1.22
= 0.5 − Area from Z = 0 to Z = 1.22
= 0.5 − 0.3888 = 0.1112.

[Fig. 9.13: area under the standard normal curve to the right of Z = 1.22]

Now, to find the exact value, let Sn = X1 + X2 + ... + X50.
The sum of independent Poisson variates is also a Poisson variate.
But m1 = m2 = ... = m50 = 0.03.
∴ The parameter of Sn is m = 50 × 0.03 = 1.5.

The probability mass function of Sn is given by

P(Sn = s) = e^(−m) m^s / s! = e^(−1.5) (1.5)^s / s!

∴ P(Sn ≥ 3) = 1 − P(Sn < 3)
= 1 − [P(Sn = 0) + P(Sn = 1) + P(Sn = 2)]
= 1 − e^(−1.5) [1 + 1.5 + (1.5)²/2!]
= 1 − e^(−1.5) [1 + 1.5 + 1.125]
= 1 − 0.8088 = 0.1912.
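Both the approximation and the exact Poisson value in this example are easy to recompute; in the sketch below (mine, not the book's) the exact tail uses the fact that the sum is Poisson(1.5):

```python
import math

def phi(z):
    # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

m, n = 0.03, 50
lam = n * m                          # parameter of the sum = 1.5
z = (3 - lam) / math.sqrt(lam)       # 1.5/sqrt(1.5) = sqrt(1.5), about 1.22
p_clt = 1.0 - phi(z)

# exact: the sum of independent Poisson variates is Poisson(lam)
p_exact = 1.0 - sum(math.exp(-lam) * lam ** k / math.factorial(k)
                    for k in range(3))
print(round(p_clt, 3), round(p_exact, 4))   # 0.11 0.1912 (table values 0.1112 and 0.1913)
```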
Example 8: If X1, X2, ..., Xn are Poisson variates with the parameter λ = 2, use the central limit theorem to estimate P(120 < Sn < 160), where Sn = X1 + X2 + ... + Xn and n = 75.

Solution: We know that for a Poisson variate Xi, E(Xi) = m = 2 and Var(Xi) = σ² = m = 2, i.e. σ = √2.

If Sn = X1 + X2 + ... + Xn, then by the Central Limit Theorem, Sn is a normal variate with mean nμ and standard deviation σ√n.

∴ Z = (Sn − nμ)/(σ√n) = (Sn − 75 × 2)/(√2 × √75) = (Sn − 150)/√150 is a S.N.V.

When Sn = 120, Z = (120 − 150)/√150 = −2.45.
When Sn = 160, Z = (160 − 150)/√150 = 0.82.

[Fig. 9.14: area under the standard normal curve between Z = −2.45 and Z = 0.82]

∴ P(120 < Sn < 160) = P(−2.45 < Z < 0.82)
= area from Z = −2.45 to Z = 0.82
= (area from Z = 0 to Z = 2.45) + (area from Z = 0 to Z = 0.82)
= 0.4929 + 0.2939 = 0.7868.

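The asymmetric interval in this example is handled by the same CDF helper; a sketch of mine, not from the text:

```python
import math

def phi(z):
    # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

lam, n = 2.0, 75
mean_S = n * lam                     # 150
sd_S = math.sqrt(lam * n)            # sqrt(150), about 12.25
p = phi((160 - mean_S) / sd_S) - phi((120 - mean_S) / sd_S)
print(round(p, 3))                   # 0.786 (two-decimal table values give 0.7868)
```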
Example 9: A distribution has unknown mean μ and variance 1.5. Using the central limit theorem, find the size of the sample such that the probability that the difference between the sample mean and the population mean will be less than 0.5 is 0.95.

Solution: We have E(Xi) = μ and Var(Xi) = 1.5, so σ = √1.5 = 1.2247.

If X̄ is the sample mean, then Z = (X̄ − μ)/(σ/√n) is a S.N.V.

We have |X̄ − μ| = 0.5,

∴ |Z| = 0.5/(1.2247/√n) = 0.4082√n.

We know from the table that P(|Z| ≤ z) = 0.95 when z = 1.96.

∴ 0.4082√n = 1.96, i.e. √n = 1.96/0.4082 = 4.80, so n = 23.05.

Hence, n must be at least 24.

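The sample-size condition above can be rearranged and evaluated directly; this sketch (mine, not from the text) solves 1.96 × σ/√n ≤ 0.5 for n:

```python
import math

sigma = math.sqrt(1.5)               # about 1.2247
z95 = 1.96                           # P(|Z| <= 1.96) = 0.95
eps = 0.5                            # allowed error in the sample mean
# z95 * sigma / sqrt(n) <= eps  =>  n >= (z95 * sigma / eps) ** 2
n_min = (z95 * sigma / eps) ** 2
print(round(n_min, 2), math.ceil(n_min))   # 23.05 24
```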
Exercise - II E
1. A random sample of size 100 is taken from a population whose mean is 60 and whose standard deviation is 20. Using the central limit theorem, find the probability that the mean of the sample will not differ from 60 by more than 4.
[Ans.: P(−2 ≤ Z ≤ 2) = 0.9544]
2. If X1, X2, ..., Xn are independent Poisson random variates, each with the parameter λ = 0.04, and Sn = X1 + X2 + ... + Xn with n = 40, use the central limit theorem to evaluate P(Sn ≥ 3). Compare your answer with the exact value of the probability.
[Hint: Z = (Sn − 40 × 0.04)/(σ√n). When Sn = 3, Z = 1.11. Sn is a Poisson variate with m = 1.6.]
[Ans.: (i) 0.1335, (ii) …]
3. If X1, X2, ..., Xn are Poisson variates with the parameter λ = 3, use the central limit theorem to estimate P(180 ≤ Sn < 250), where Sn = X1 + X2 + ... + Xn and n = 70.
[Ans.: 0.9…]
4. Considering the life-span of persons in a locality as a random variable with mean 70 years and standard deviation 20 years, and using the central limit theorem, find the probability that the average life of 80 persons will exceed 85 years.
[Ans.: 0.0…]

Exercise - I E
Theory
1. State and prove Chebychev's inequality. (M.U. 2012, 13)
2. State Markov's inequality.
3. State Chernoff's inequality.
4. State the central limit theorem. (M.U. 2011, 12)
5. Define stochastic convergence.
6. Describe a sequence of random variables. (M.U. 2012)
7. Define a sequence of random variables. (M.U. 2009, 10)
8. State the Central Limit Theorem and explain its significance. (M.U. 2009, 10)
9. State the strong law of large numbers. (M.U. 2009, 10)
