Stochastic Processes and Their Applications

Frank Beichelt
This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have been
made to publish reliable data and information, but the author and publisher cannot assume responsibility for the
validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the
copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to
publish in this form has not been obtained. If any copyright material has not been acknowledged, please write and let us know so we may rectify this in any future reprint.
Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or
utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including
photocopying, microfilming, and recording, or in any information storage or retrieval system, without written
permission from the publishers.
For permission to photocopy or use material electronically from this work, please access www.copyright.com or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers,
MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration for a variety of
users. For organizations that have been granted a photocopy license by the CCC, a separate system of payment has
been arranged.
Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for
identification and explanation without intent to infringe.
Title of the original edition: Beichelt, Frank: Stochastische Prozesse für Ingenieure, © B.G. Teubner, Stuttgart 1997.
Translation arranged with the approval of the publisher B.G. Teubner from the original German edition into English.
Contents
Preface ix
1 Probability Theory 1
1.1 Random Events and their Probabilities 1
1.2 Random Variables 4
1.2.1 Discrete Random Variables 4
1.2.2 Continuous Random Variables 6
1.2.3 Nonnegative Random Variables 8
1.3 Random Vectors 13
1.3.1 Two-Dimensional Random Vectors 13
1.3.2 n-Dimensional Random Vectors 20
1.4 Sums and Sequences of Random Variables 22
1.5 Transformations of Probability Distributions 31
1.5.1 z-Transformation 32
1.5.2 Laplace-Transformation 34
Exercises 37
2 Stochastic Processes 43
2.1 Introduction 43
2.2 Characteristics of Stochastic Processes 47
2.3 Properties of Stochastic Processes 49
2.4 Special Stochastic Processes 53
2.4.1 Continuous-Time Stochastic Processes 53
2.4.2 Stationary Discrete-Time Stochastic Processes 60
Exercises 67
3 Poisson Processes 69
3.1 Homogeneous Poisson Process 69
3.1.1 Definition and Properties 69
3.1.2 Poisson Process and Uniform Distribution 77
3.2 Inhomogeneous Poisson Process 85
3.2.1 Definition and Properties 85
3.2.2 Minimal Repair 89
Exercises 97
4 Renewal Processes 99
4.1 Foundations 99
4.2 Renewal Function 102
4.2.1 Renewal Equations 102
4.2.2 Bounds for the Renewal Function 107
4.3 Recurrence Times 111
4.4 Asymptotic Behaviour 114
4.5 Stationary Renewal Processes 118
4.6 Alternating Renewal Processes 120
4.7 Cumulative Stochastic Processes 125
4.8 Regenerative Stochastic Processes 131
Exercises 134
Preface

This book is an introduction to stochastic processes and their applications for students in
engineering, industrial statistics, science, operations research, business and public policy
analysis, and finance. It provides theoretical foundations for modeling time-dependent
random phenomena as they occur in physics, electronics, computer science and biology.
Likewise, it takes into account applications in the field of operations research, in
particular in queueing, maintenance and reliability theory as well as in financial markets.
Through numerous, mostly science and engineering-based examples, the subject is
represented in a comprehensible, practically oriented way. Hence the book is also
suitable for self-study. As a non-measure theoretic introduction to stochastic processes,
its study only requires a knowledge of calculus and elementary probability theory which
should be familiar to students after having finished their undergraduate studies in
mathematics and statistics. However, to make the book attractive to mathematically
interested readers as well, some important proofs and theoretically challenging examples
and exercises have been included. Solutions to most of the exercises can be found in an appendix, or they are given together with the exercises themselves. Those sections, examples, or exercises marked with a * symbol are either theoretically more difficult or of less practical importance. The chapters are organized in such a way that reading a chapter usually requires knowledge of the previous ones.
A mathematically rigorous treatment of Wiener processes (Brownian motion processes)
and the spectral analysis of stationary processes, dealt with in chapters 7 and 8,
respectively, is not possible without a basic knowledge of measure theory, Fourier
analysis, and the theory of generalized functions. Therefore, these chapters sometimes
present heuristically motivated explanations and formulas instead of mathematically exact
concepts and derivations. This approach does not detract from the main purpose of this
book, which is to enable readers to apply stochastic modeling in their own field. This
book generally does not deal with the data analysis aspects of stochastic processes. It
can be anticipated that readers will use statistical software for tackling numerical
problems. However, after having studied our book they will be able to work more
creatively with such software and write their own analysis programs. The text may also
serve as a basis for preparing senior undergraduate and graduate level courses.
The authors wish to thank their Honours students of the year 2000, who checked and
tested the exercises and gave many valuable hints. In particular, we acknowledge the
contributions of H. Christoforou, S. Knight, T. Levin, V. Nkwambi and D. Tugendhaft.
Finally, the authors would like to thank Ms. Frances Horrocks and Mrs. Janie Wardle of
Taylor & Francis for their committed, constructive cooperation.
The book is a thoroughly checked, word-for-word translation of the German original published by B. G. Teubner, Stuttgart, in 1997 under the title "Stochastische Prozesse für Ingenieure". The numerous suggestions of German-speaking readers helped greatly in preparing this English edition. Many thanks for their support as well. In this second reprint, some corrections have been made.
Further helpful comments on this book are very welcome and should be sent to:
University of the Witwatersrand, School of Statistics and Actuarial Science, WITS 2050,
Johannesburg, Republic of South Africa. E-mail: [email protected].
Symbols and Abbreviations

Probability Theory
X, Y, Z    random variables
E(X), Var(X)    expected (mean) value, variance of X
f_X(x), F_X(x)    probability density function, (cumulative probability) distribution function of X
λ(x), Λ(x)    failure rate, integrated failure rate (hazard function)
N(μ, σ²)    normally distributed random variable with expected value μ and variance σ²
φ(x), Φ(x)    probability density function, distribution function of a standard normal random variable
f_X(x_1, x_2, ..., x_n)    joint probability density function of X = (X_1, X_2, ..., X_n)
F_X(x_1, x_2, ..., x_n)    joint distribution function of X = (X_1, X_2, ..., X_n)
Cov(X, Y)    covariance between X and Y
ρ(X, Y)    correlation coefficient of X and Y
M(z)    z-transform (moment generating function) of a discrete random variable or its probability distribution
Stochastic Processes
{X(t), t ∈ T}, {X_t, t ∈ T}    continuous-time, discrete-time stochastic process with parameter space T
Z    state space of a stochastic process
f_t(x), F_t(x)    probability density, distribution function of X(t)
f_{t_1,...,t_n}(x_1, ..., x_n), F_{t_1,...,t_n}(x_1, ..., x_n)    joint probability density function, distribution function of (X(t_1), X(t_2), ..., X(t_n))
m(t)    trend function of a stochastic process
C(s,t)    covariance function of a stochastic process
C(t)    covariance function of a stationary process
ρ(s,t)    correlation function of a stochastic process
1 Probability Theory

1.1 Random Events and their Probabilities
The following concepts refer to the same random experiment. A possible outcome a of the experiment is called an elementary (simple) event. The set of all elementary events is called the space of elementary events or sample space. Here and in what follows the sample space is denoted by M. A sample space is discrete if it is a finite or countably infinite set. A random event (briefly: event) A is a subset of M. An event A is said to have occurred if the outcome a of the random experiment is an element of A, i.e., if a ∈ A. Let A and B be two events. Then the set-theoretic operations intersection "∩" and union "∪" can be interpreted in the following way: A ∩ B is the event that both A and B occur, and A ∪ B is the event that A or B (or both) occur. If A ⊆ B, i.e. if A is a subset of B, then the occurrence of A implies the occurrence of B. A \ B is the set of all those elementary events which are elements of A, but not of B. Hence, A \ B is the event that A occurs but not B. The event
$\overline{A} = M \setminus A$ is the complement of A. Thus, if A occurs, then $\overline{A}$ does not, and vice versa. Let A_1, A_2, ..., A_n be a sequence of events. Then de Morgan's rules hold:

$\overline{A_1 \cup A_2 \cup \cdots \cup A_n} = \overline{A}_1 \cap \overline{A}_2 \cap \cdots \cap \overline{A}_n, \qquad \overline{A_1 \cap A_2 \cap \cdots \cap A_n} = \overline{A}_1 \cup \overline{A}_2 \cup \cdots \cup \overline{A}_n$   (1.1)

In particular, if n = 2, A_1 = A and A_2 = B,

$\overline{A \cup B} = \overline{A} \cap \overline{B}, \qquad \overline{A \cap B} = \overline{A} \cup \overline{B}$   (1.2)
The empty set ∅ is the impossible event since, not containing any elementary events, it can never occur. By definition, M contains all elementary events, so that it must always occur. Hence M is called the certain event. Two events A and B are called mutually exclusive if their joint occurrence is impossible, i.e. if A ∩ B = ∅. In this case the occurrence of A implies that B does not occur, and vice versa. In particular, A and $\overline{A}$ are mutually exclusive.
Let ℳ be the set of all events which can occur when carrying out the random experiment. Further, let P = P(·) be a function on ℳ with the following properties:

I) P(∅) = 0, P(M) = 1.
II) For any event A, 0 ≤ P(A) ≤ 1.
III) For any sequence of pairwise mutually exclusive events A_1, A_2, ...,

$P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i)$   (1.3)
The number P(A) is the probability of the event A. P(A) characterizes the degree of certainty of the occurrence of A. This interpretation of the probability is justified by the following implications of the properties I) to III):

1) $P(\overline{A}) = 1 - P(A)$
2) If A ⊆ B, then P(A) ≤ P(B).
3) If A and B are mutually exclusive events, i.e. if A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B).

Hint: It is assumed that all events which arise from applying the operations ∩, ∪, complementation, and \ to any elements of ℳ are also elements of ℳ.
The probabilities of random events are usually unknown. However, they can be estimated by their relative frequencies. If, in a series of n repetitions of the same random experiment, the event A has been observed m = m(A) times, then the relative frequency of A is given by

$\hat{p}_n(A) = \frac{m(A)}{n}$

Therefore, the probability of A can be estimated with any required level of accuracy from its relative frequency by sufficiently many repetitions of the random experiment.
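As an illustration, the probability of an event can be approximated by simulating the experiment many times. The following Python sketch is our own; the die-rolling experiment and the event A = "an even number is rolled" (with P(A) = 0.5) are illustrative assumptions:

    import random

    def relative_frequency(n):
        """Estimate P(A) by the relative frequency m(A)/n, where A is the
        event 'an even number is rolled' in n rolls of a fair die."""
        m = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)
        return m / n

    # The estimate approaches P(A) = 0.5 as the number of repetitions grows:
    for n in (100, 10_000, 1_000_000):
        print(n, relative_frequency(n))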
Two events A and B are called independent if

$P(A \cap B) = P(A)\,P(B)$   (1.4)

The events A_1, A_2, ..., A_n are independent if for any subset {A_{i_1}, A_{i_2}, ..., A_{i_k}} of these events

$P(A_{i_1} \cap A_{i_2} \cap \cdots \cap A_{i_k}) = P(A_{i_1})\,P(A_{i_2}) \cdots P(A_{i_k})$   (1.5)

Let A and B be two events with P(B) > 0. Then the conditional probability of A given B is

$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$   (1.6)

Let {A_1, A_2, ..., A_n} be an exhaustive and mutually exclusive set of events. Then, for any event B, the total probability formula holds:

$P(B) = \sum_{i=1}^{n} P(B \mid A_i)\,P(A_i)$   (1.7)
1.2 Random Variables

1.2.1 Discrete Random Variables

The distribution function of a random variable X is defined as F(x) = P(X ≤ x). Thus, F(x) is the probability of the random event that X assumes a realization which is less than or equal to x. For a < b,

$P(a < X \le b) = F(b) - F(a)$   (1.8)

The range of a discrete random variable is a finite or countably infinite set. Examples of discrete random variables were given in section 1.1 (examples 2 to 5). Let {x_0, x_1, x_2, ...} be the range of X, with x_i < x_k for i < k. Further, let p_i be the probability of the random event that X assumes the realization x_i:

$p_i = P(X = x_i); \quad i = 0, 1, \ldots, \qquad \text{with} \quad \sum_{i=0}^{\infty} p_i = 1$
Conversely, any sequence of nonnegative numbers {p_0, p_1, ...} satisfying this condition can be considered to be the probability distribution of a discrete random variable.

The distribution function of X is

$F(x) = \sum_{x_i \le x} p_i$

If the range of X is finite and if x_n is the greatest realization of X, then this definition has to be supplemented by

$F(x) = 1 \quad \text{for} \quad x \ge x_n$

Thus, given {p_0, p_1, ...} the distribution function of X can be constructed and, vice versa, given the distribution function of X, the probabilities p_i = P(X = x_i) can be obtained.

The expected value (mean value) E(X) and the variance Var(X) of X are given by

$E(X) = \sum_{i=0}^{\infty} x_i\,p_i$

and

$Var(X) = \sum_{i=0}^{\infty} \big(x_i - E(X)\big)^2\,p_i$
Table 1.1 Probability distributions of important discrete random variables

Distribution          Range of X            p_i = P(X = x_i)
uniform               {x_1, ..., x_n}       p_i = 1/n
geometric             {0, 1, ...}           p_i = p(1 − p)^i;  0 < p < 1
binomial              {0, 1, ..., n}        p_i = \binom{n}{i} p^i (1 − p)^{n−i};  0 < p < 1
negative binomial     {0, 1, ...}           p_i = \binom{i + r − 1}{i} p^r (1 − p)^i;  0 < p < 1, r > 0
Poisson               {0, 1, ...}           p_i = (λ^i / i!) e^{−λ};  λ > 0
1.2.2 Continuous Random Variables

A random variable X is continuous if there exists a nonnegative function f(x), the probability density of X, such that

$F(x) = \int_{-\infty}^{x} f(u)\,du$

In particular,

$\int_{-\infty}^{+\infty} f(x)\,dx = 1$

Conversely, every nonnegative function f(x) satisfying this condition can be considered to be the probability density of a random variable X. As with its distribution function, a continuous random variable is also completely characterized by its probability density.

Expected value (mean value) E(X) and variance Var(X) of X are defined by

$E(X) = \int_{-\infty}^{+\infty} x f(x)\,dx, \qquad Var(X) = \int_{-\infty}^{+\infty} \big(x - E(X)\big)^2 f(x)\,dx$

The n th moment of X is

$E(X^n) = \int_{-\infty}^{+\infty} x^n f(x)\,dx$

In particular, $Var(X) = E(X^2) - (E(X))^2$. This relationship also holds for discrete random variables. For a continuous random variable X the probability (1.8) can be written in the following form:

$P(a < X \le b) = \int_{a}^{b} f(x)\,dx$

The range of X coincides with the set of all those x for which f(x) > 0.
Table 1.2 Probability densities of important continuous random variables

Distribution                        Range of X       f(x)
Uniform distribution over [c, d]    c ≤ x ≤ d        f(x) = 1/(d − c)
Exponential distribution            x ≥ 0            f(x) = λ e^{−λx};  λ > 0
Gamma distribution                  x > 0            f(x) = λ^α x^{α−1} e^{−λx} / Γ(α);  α, λ > 0
Beta distribution in [0, 1]         0 ≤ x ≤ 1        f(x) = x^{a−1}(1 − x)^{b−1} / B(a, b);  a, b > 0
Erlang distribution                 x ≥ 0            f(x) = λ (λx)^{n−1} e^{−λx} / (n − 1)!;  n = 1, 2, ...
Normal (Gauss-) distribution        −∞ < x < +∞      f(x) = (σ√(2π))^{−1} exp(−(x − μ)²/(2σ²))

Comment The densities have the given functional forms over the ranges specified in the second column. Elsewhere they are identically zero. The Gamma function Γ(x) and Beta function B(x, y) are defined by

$\Gamma(x) = \int_0^{\infty} t^{x-1} e^{-t}\,dt, \qquad B(x, y) = \int_0^1 t^{x-1} (1 - t)^{y-1}\,dt = \frac{\Gamma(x)\,\Gamma(y)}{\Gamma(x + y)}$
1.2.3 Nonnegative Random Variables

If X is a nonnegative discrete random variable in the range {0, 1, ...} with probability distribution {p_i = P(X = i); i = 0, 1, ...}, then its expected value can be written as

$E(X) = \sum_{i=0}^{\infty} P(X > i)$   (1.11)

Hence, for a nonnegative continuous random variable X,

$E(X) = \int_0^{\infty} \big(1 - F(x)\big)\,dx = \int_0^{\infty} \overline{F}(x)\,dx$   (1.12)

If X is interpreted as the lifetime of a system, then $\overline{F}(x) = 1 - F(x) = P(X > x)$ is called the survival probability, because F(x) and $\overline{F}(x)$ are the respective probabilities that the system does or does not fail in [0, x].

Let us now consider the distribution function of the residual lifetime of a system which has already worked for t time units without failing (Figure 1.1). This conditional failure probability will be denoted by F_t(x):

$F_t(x) = P(X - t \le x \mid X > t) = \frac{F(t + x) - F(t)}{1 - F(t)}$   (1.13)

The corresponding conditional survival probability is

$\overline{F}_t(x) = P(X - t > x \mid X > t) = \frac{\overline{F}(t + x)}{\overline{F}(t)}$   (1.14)
Example 1.1 (uniform distribution) Let the random variable X be uniformly distributed over [0, T]. Then its density and distribution function are (see Table 1.2)

$f(x) = \frac{1}{T}, \quad 0 \le x \le T; \quad f(x) = 0 \ \text{elsewhere}; \qquad F(x) = \frac{x}{T}, \quad 0 \le x \le T$

From (1.13),

$F_t(x) = \frac{F(t + x) - F(t)}{1 - F(t)} = \frac{x}{T - t}, \quad 0 \le x \le T - t$

Thus, the residual lifetime after the time point t is uniformly distributed over the interval [0, T − t].

Figure 1.2 Density and distribution function of a random variable uniformly distributed over [0, T]
Example 1.2 (exponential distribution) Let the lifetime X be exponentially distributed with parameter λ, i.e. F(x) = 1 − e^{−λx}, x ≥ 0. Then, from (1.14),

$\overline{F}_t(x) = \frac{\overline{F}(t + x)}{\overline{F}(t)} = \frac{e^{-\lambda(t + x)}}{e^{-\lambda t}} = e^{-\lambda x} = \overline{F}(x)$   (1.15)
Thus, the residual lifetime of the system has the same distribution function as the lifetime of a new system: it is exponentially distributed with parameter λ. The exponential distribution is the only continuous probability distribution which has this so-called memoryless property or lack of memory property. Consequently, the age of a system with exponential lifetime has no influence on its future failure behaviour. Or, equivalently, if the system has not failed in the interval [0, t], then, with respect to its failure behaviour in [t, ∞), it is "as good as new". Electronic hardware often has this property after the "early failure time period".
The relationship (1.15) can also be written in the form

$\overline{F}(t + x) = \overline{F}(t)\,\overline{F}(x)$   (1.16)

It can be shown that the exponential distribution is the only one which satisfies (1.16). □
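The memoryless property (1.15) can be checked numerically. The following Python sketch is our own illustration; the parameter values λ = 2, t = 0.7, and x = 0.5 are arbitrary assumptions:

    import random

    LAMBDA, T, X = 2.0, 0.7, 0.5
    N = 100_000

    lifetimes = [random.expovariate(LAMBDA) for _ in range(N)]

    # Unconditional survival probability P(X > x) ...
    p_uncond = sum(1 for v in lifetimes if v > X) / N

    # ... versus the conditional survival probability P(X > t + x | X > t).
    survivors = [v for v in lifetimes if v > T]
    p_cond = sum(1 for v in survivors if v > T + X) / len(survivors)

    # Both estimates are close to exp(-LAMBDA * X) = exp(-1) ≈ 0.368.
    print(p_uncond, p_cond)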
The practical background of the conditional failure probability motivates the following definition:

Definition 1.1 A system is aging in the interval [t_1, t_2], t_1 < t_2, if and only if, for arbitrary but fixed x and for increasing t, t_1 ≤ t ≤ t_2, the conditional failure probability F_t(x) is increasing (the conditional survival probability $\overline{F}_t(x)$ is decreasing).
The following considerations provide another approach to modeling the aging behaviour of systems: When the conditional failure probability F_t(Δt) of a system in the interval [t, t + Δt] is considered relative to the length Δt of this interval, one obtains a conditional failure probability per unit time, F_t(Δt)/Δt, that is, a "failure probability rate". As Δt → 0, this rate tends to a function λ(t), which gives information on the instantaneous tendency of the system to fail:
$\lambda(t) = \lim_{\Delta t \to 0} \frac{F_t(\Delta t)}{\Delta t} = \frac{f(t)}{\overline{F}(t)}$   (1.17)

Hence,

$\overline{F}_t(x) = \exp\left(-\int_t^{t+x} \lambda(u)\,du\right)$   (1.18)

The function λ(t) is called the failure rate. The integrated failure rate, or the hazard function, is given by

$\Lambda(t) = \int_0^t \lambda(x)\,dx$   (1.19)
(1.18) implies an important property of the failure rate:

A system ages in the interval [t_1, t_2], t_1 < t_2, if and only if its failure rate is increasing in this interval.
Example 1.3 (Weibull distribution) A random variable X has a Weibull distribution with parameters β and θ if it has density

$f(x) = \frac{\beta}{\theta}\left(\frac{x}{\theta}\right)^{\beta - 1} \exp\left(-\left(\frac{x}{\theta}\right)^{\beta}\right), \quad x \ge 0; \ \beta, \theta > 0$

and, correspondingly, distribution function $F(x) = 1 - \exp(-(x/\theta)^{\beta})$, x ≥ 0, so that the hazard function is $\Lambda(x) = (x/\theta)^{\beta}$. By differentiation,

$\lambda(x) = \frac{\beta}{\theta}\left(\frac{x}{\theta}\right)^{\beta - 1}, \quad x \ge 0$

This result shows once more that θ is a scale parameter. Special cases of the Weibull distribution are the exponential distribution (β = 1) and the Rayleigh distribution (β = 2). Analogously to the exponential distribution, the distribution function of a Weibull distributed random variable is sometimes written in the form

$F(x) = 1 - \exp(-\Lambda(x)) \quad \text{with} \quad \Lambda(x) = (x/\theta)^{\beta}$
For many applications, the following property of the failure rate λ(x) is important:

$P(x < X \le x + \Delta x \mid X > x) = \lambda(x)\,\Delta x + o(\Delta x)$   (1.20)

o(h) is Landau's order symbol with respect to h → 0, i.e. any function of h satisfying

$\lim_{h \to 0} \frac{o(h)}{h} = 0$

(see Appendix 1). Therefore, for Δx sufficiently small, λ(x)Δx is approximately the probability of a system failure in (x, x + Δx], provided that the system has operated in [0, x] without failing. This property of the failure rate can be used for statistical estimation: At time t = 0 a specified number of systems with independent, identically distributed lifetimes start working. Then the failure rate of these systems in the interval (x, x + Δx] is approximately equal to the number of systems having failed in (x, x + Δx], divided by the number of systems which are still operating at time x.
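This estimation recipe translates directly into a short simulation. The following Python sketch is our own; the sample size, the interval length Δx, and the choice of Rayleigh lifetimes (Weibull with β = 2, θ = 1, for which λ(x) = 2x by example 1.3) are illustrative assumptions. Dividing by Δx converts the conditional interval failure probability into a rate:

    import random

    def empirical_failure_rate(lifetimes, x, dx):
        """Estimate lambda(x): the number of failures in (x, x+dx],
        relative to the number of systems still operating at time x,
        per unit time."""
        at_risk = [t for t in lifetimes if t > x]
        failed = sum(1 for t in at_risk if t <= x + dx)
        return failed / (len(at_risk) * dx)

    random.seed(1)
    # Rayleigh lifetimes: Weibull with scale theta = 1 and shape beta = 2.
    lifetimes = [random.weibullvariate(1.0, 2.0) for _ in range(200_000)]

    for x in (0.2, 0.5, 1.0):
        print(x, empirical_failure_rate(lifetimes, x, dx=0.01), "exact:", 2 * x)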
1.3 Random Vectors

1.3.1 Two-Dimensional Random Vectors

Let X and Y be two discrete random variables with ranges {x_0, x_1, ...} and {y_0, y_1, ...}, respectively, and let

$r_{ij} = P(X = x_i, Y = y_j); \quad i, j = 0, 1, \ldots$

The set of probabilities {r_ij; i, j = 0, 1, ...} is the joint or two-dimensional probability distribution of the random vector (X, Y). The individual probability distributions of X and Y are referred to as the marginal distributions of the joint distribution of (X, Y). The following relationship holds between the joint probability distribution and its marginal distributions:

$P(X = x_i) = \sum_{j=0}^{\infty} r_{ij}, \qquad P(Y = y_j) = \sum_{i=0}^{\infty} r_{ij}$   (1.21)
The sets

$\{P(X = x_i \mid Y = y_j); \ i = 0, 1, \ldots\} \quad \text{and} \quad \{P(Y = y_j \mid X = x_i); \ j = 0, 1, \ldots\}$

are the conditional distributions of X given Y = y_j and of Y given X = x_i, respectively. Hence, the corresponding conditional expected values are

$E(X \mid Y = y_j) = \sum_{i=0}^{\infty} x_i\,P(X = x_i \mid Y = y_j), \qquad E(Y \mid X = x_i) = \sum_{j=0}^{\infty} y_j\,P(Y = y_j \mid X = x_i)$   (1.22)
Two discrete random variables X and Y are called independent if

$r_{ij} = P(X = x_i)\,P(Y = y_j); \quad i, j = 0, 1, \ldots$   (1.23)

i.e. X and Y are independent if the random events "X = x_i" and "Y = y_j" are independent for all i, j = 0, 1, ... (see section 1.1).

If X and Y are independent, then, for all i, j = 0, 1, ...,

$P(X = x_i \mid Y = y_j) = P(X = x_i)$

and

$P(Y = y_j \mid X = x_i) = P(Y = y_j)$
The joint distribution function of the random vector (X, Y) is defined by the probability

$F_{X,Y}(x, y) = P(X \le x, Y \le y)$   (1.24)

as a function of x and y; x, y ∈ (−∞, +∞). The joint distribution function of the random vector (X, Y) characterizes its joint or two-dimensional probability distribution. (In case of discrete components the joint distribution function is, of course, defined in the same way.) F_{X,Y}(x, y) has the properties

1) $F_{X,Y}(x, -\infty) = F_{X,Y}(-\infty, y) = 0$
2) $F_{X,Y}(+\infty, +\infty) = 1$
3) $P(x_1 < X \le x_2,\ y_1 < Y \le y_2) = F_{X,Y}(x_2, y_2) - F_{X,Y}(x_1, y_2) - F_{X,Y}(x_2, y_1) + F_{X,Y}(x_1, y_1) \ge 0$   (1.25)
4) $F_{X,Y}(x, y)$ is nondecreasing and continuous from the right in each argument.

Conversely, any function of two variables which has these properties is the joint distribution function of a random vector (X, Y).
Assuming its existence, the second partial derivative of F_{X,Y}(x, y) with respect to x and y,

$f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x\,\partial y}$

is called the joint probability density of (X, Y). The joint density can equivalently be defined by

$F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v)\,dv\,du$   (1.26)

Conversely, any nonnegative function of two variables x and y satisfying this condition can be considered to be the joint density of a random vector (X, Y).

The probability that the random vector (X, Y) assumes a realization in the region B of the (x, y)-plane is given by the area integral

$P\big((X, Y) \in B\big) = \iint_B f_{X,Y}(x, y)\,dx\,dy$   (1.27)
Putting y = +∞ and x = +∞, respectively, one obtains the marginal distribution functions of F_{X,Y}(x, y):

$F_X(x) = F_{X,Y}(x, +\infty), \qquad F_Y(y) = F_{X,Y}(+\infty, y)$

Thus, the marginal distribution functions belonging to F_{X,Y}(x, y) are simply the distribution functions of X and Y, respectively. Similarly, the densities of X and Y are the marginal densities belonging to f_{X,Y}(x, y). In view of (1.26),

$f_X(x) = \int_{-\infty}^{+\infty} f_{X,Y}(x, y)\,dy, \qquad f_Y(y) = \int_{-\infty}^{+\infty} f_{X,Y}(x, y)\,dx$   (1.28)

Two random variables X and Y with joint distribution function F_{X,Y}(x, y) are independent if, for all x and y,

$F_{X,Y}(x, y) = F_X(x)\,F_Y(y)$

or, equivalently,

$P(X \le x, Y \le y) = P(X \le x)\,P(Y \le y)$

If the joint density f_{X,Y}(x, y) of (X, Y) exists, then the independence of X and Y is equivalent to f_{X,Y}(x, y) being the product of the marginal densities:

$f_{X,Y}(x, y) = f_X(x)\,f_Y(y)$   (1.29)
The conditional density of X given Y = y is

$f_X(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}, \quad f_Y(y) > 0$

Hence,

$F_X(x) = \int_{-\infty}^{+\infty} \left(\int_{-\infty}^{x} f_X(u \mid y)\,du\right) f_Y(y)\,dy$

Thus, F_X(x) can be interpreted as the expected value of the conditional distribution function of X given Y:

$F_X(x) = E\big(F_X(x \mid Y)\big)$   (1.30)

The conditional expected value of X given Y = y is

$E(X \mid Y = y) = \int_{-\infty}^{+\infty} x\,f_X(x \mid y)\,dx$

In view of (1.29), for independent X and Y,

$E(X \mid Y = y) = E(X)$   (1.31)
The expected values of the sum and of the product of X and Y are given by

$E(X + Y) = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} (x + y)\,f_{X,Y}(x, y)\,dx\,dy, \qquad E(XY) = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} xy\,f_{X,Y}(x, y)\,dx\,dy$

Taking into account (1.28), one obtains the same formulas as in the case of discrete components:

$E(X + Y) = E(X) + E(Y)$   (1.32)

and, for independent X and Y,

$E(XY) = E(X)\,E(Y)$   (1.33)

In particular, the covariance between X and Y is

$Cov(X, Y) = E\big[(X - E(X))(Y - E(Y))\big] = E(XY) - E(X)\,E(Y)$

From (1.33) it follows that if X and Y are independent, then Cov(X, Y) = 0. But if Cov(X, Y) = 0, then X and Y are not necessarily independent.
The correlation coefficient of X and Y is defined by

$\rho(X, Y) = \frac{Cov(X, Y)}{\sqrt{Var(X)\,Var(Y)}}$   (1.34)

Thus, if X and Y are independent, then they are uncorrelated. But if X and Y are uncorrelated, they need not be independent.
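A standard way to see this is the pair X, Y = X² with X standard normal: Y is completely determined by X, yet Cov(X, Y) = E(X³) = 0. The following Python check is our own illustrative sketch:

    import random

    random.seed(2)
    xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
    ys = [x * x for x in xs]          # Y = X^2 is a function of X

    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

    # Cov(X, X^2) = E(X^3) = 0 for symmetric X, although X and Y are
    # strongly dependent.
    print(cov)   # close to 0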
Example 1.4 (bivariate normal distribution) The random vector (X, Y) is bivariate normally distributed with parameters

μ_x, μ_y, σ_x, σ_y and ρ;  −∞ < μ_x, μ_y < +∞,  σ_x > 0,  σ_y > 0,  −1 < ρ < 1,

if it has joint density (Figure 1.4)

$f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_x\sigma_y\sqrt{1-\rho^2}} \exp\left\{-\frac{1}{2(1-\rho^2)}\left[\left(\frac{x-\mu_x}{\sigma_x}\right)^2 - 2\rho\,\frac{(x-\mu_x)(y-\mu_y)}{\sigma_x\sigma_y} + \left(\frac{y-\mu_y}{\sigma_y}\right)^2\right]\right\}$
Corollary If (X, Y) is bivariate normally distributed with parameters μ_x, σ_x, μ_y, σ_y, and ρ, then X and Y are normally distributed with parameters μ_x, σ_x and μ_y, σ_y, respectively. X and Y are independent if and only if ρ = 0. (Note that the independence of X and Y is equivalent to $f_{X,Y}(x, y) = f_X(x)\,f_Y(y)$.)
It can be shown that the parameter ρ is equal to the correlation coefficient between X and Y (ρ = ρ(X, Y)). Therefore,

If the random vector (X, Y) has a bivariate normal distribution, then X and Y are independent if and only if they are uncorrelated.
The conditional distribution of X given Y = y is also normal. The parameters of this distribution are the conditional expected value of X given that Y = y and the conditional variance of X given that Y = y:

$E(X \mid Y = y) = \mu_x + \rho\,\frac{\sigma_x}{\sigma_y}(y - \mu_y), \qquad Var(X \mid Y = y) = \sigma_x^2\,(1 - \rho^2)$

Of course, in these formulas the roles of X and Y can be changed. Sums of normally distributed random variables are considered in section 1.4. □
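These conditional moments can be verified by simulation. The following Python sketch is our own; the parameter values and the construction of (X, Y) from two independent standard normal variables U, V are illustrative assumptions:

    import math, random

    mu_x, mu_y, s_x, s_y, rho = 1.0, -2.0, 2.0, 0.5, 0.8
    random.seed(3)

    def bivariate_normal():
        """Generate (X, Y) with the desired means, standard deviations,
        and correlation coefficient rho."""
        u, v = random.gauss(0, 1), random.gauss(0, 1)
        x = mu_x + s_x * u
        y = mu_y + s_y * (rho * u + math.sqrt(1 - rho ** 2) * v)
        return x, y

    pairs = [bivariate_normal() for _ in range(400_000)]

    # E(X | Y = y0) should equal mu_x + rho * (s_x / s_y) * (y0 - mu_y).
    y0, eps = -1.5, 0.02
    slab = [x for x, y in pairs if abs(y - y0) < eps]
    print(sum(slab) / len(slab), mu_x + rho * (s_x / s_y) * (y0 - mu_y))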
1.3.2 n-Dimensional Random Vectors

Let X_1, X_2, ..., X_n be continuous random variables with distribution functions F_{X_1}(x_1), F_{X_2}(x_2), ..., F_{X_n}(x_n) and densities f_{X_1}(x_1), f_{X_2}(x_2), ..., f_{X_n}(x_n). The joint distribution function of the random vector X = (X_1, X_2, ..., X_n) is

$F_X(x_1, x_2, \ldots, x_n) = P(X_1 \le x_1, X_2 \le x_2, \ldots, X_n \le x_n)$

Provided it exists, the n th mixed partial derivative of the joint distribution function with respect to x_1, x_2, ..., x_n,

$f_X(x_1, x_2, \ldots, x_n) = \frac{\partial^n F_X(x_1, x_2, \ldots, x_n)}{\partial x_1\,\partial x_2 \cdots \partial x_n}$

is called the joint (probability) density of the random vector X. The functions F_X(x_1, x_2, ..., x_n) and f_X(x_1, x_2, ..., x_n) are also called the n-dimensional distribution function and the n-dimensional (probability) density, respectively, of X. They determine the n-dimensional probability distribution of X. The characteristic properties of two-dimensional distribution functions and probability densities can be extended in a straightforward way to n-dimensional distribution functions and densities. Hence they will not be given here.
The distribution functions and the densities of the X_i can be obtained from the joint distribution function and density, respectively, analogously to the two-dimensional case:

$F_{X_i}(x_i) = F_X(+\infty, \ldots, +\infty, x_i, +\infty, \ldots, +\infty), \quad f_{X_i}(x_i) = \int_{-\infty}^{+\infty} \cdots \int_{-\infty}^{+\infty} f_X(x_1, \ldots, x_n)\,dx_1 \cdots dx_{i-1}\,dx_{i+1} \cdots dx_n$   (1.35)

In this formula, x_i is kept fixed; the integration runs over all the other variables. The random variables X_1, X_2, ..., X_n are independent if

$f_X(x_1, x_2, \ldots, x_n) = f_{X_1}(x_1)\,f_{X_2}(x_2) \cdots f_{X_n}(x_n)$   (1.36)

The joint distribution function is the n-dimensional integral

$F_X(x_1, \ldots, x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f_X(u_1, \ldots, u_n)\,du_n \cdots du_1$

In view of (1.35) and (1.36), for independent X_i this n-dimensional integral simplifies to

$F_X(x_1, x_2, \ldots, x_n) = F_{X_1}(x_1)\,F_{X_2}(x_2) \cdots F_{X_n}(x_n)$   (1.37)
Let

$\sigma_{ij} = Cov(X_i, X_j) \quad \text{and} \quad \rho_{ij} = \rho(X_i, X_j)$

be the covariance and the correlation coefficient between X_i and X_j, respectively. (Note that σ_ii = Var(X_i) and ρ_ii = 1.) It is useful to combine these parameters in the covariance matrix Σ and in the correlation matrix ρ, respectively:

$\Sigma = \big(\sigma_{ij}\big)_{i,j=1}^{n}, \qquad \rho = \big(\rho_{ij}\big)_{i,j=1}^{n}$
Theorem 1.1 Let the random vector (X_1, X_2, ..., X_n) be n-dimensionally normally distributed and let the random variables Y_1, Y_2, ..., Y_m be linear combinations of the X_i:

$Y_j = \sum_{i=1}^{n} a_{ji} X_i; \quad j = 1, 2, \ldots, m$

Then the random vector (Y_1, Y_2, ..., Y_m) has an m-dimensional normal distribution. ■
1.4 Sums and Sequences of Random Variables

Expected value of a sum The expected value of the sum X_1 + X_2 + ··· + X_n is defined by

$E(X_1 + X_2 + \cdots + X_n) = \int_{-\infty}^{+\infty} \cdots \int_{-\infty}^{+\infty} (x_1 + x_2 + \cdots + x_n)\,f_X(x_1, \ldots, x_n)\,dx_1 \cdots dx_n$

From (1.35),

$\int_{-\infty}^{+\infty} \cdots \int_{-\infty}^{+\infty} x_i\,f_X(x_1, \ldots, x_n)\,dx_1 \cdots dx_n = E(X_i)$

Hence,

$E(X_1 + X_2 + \cdots + X_n) = E(X_1) + E(X_2) + \cdots + E(X_n)$   (1.38)

The expected value of the sum of any random variables is equal to the sum of the expected values of these random variables.

Using formula (1.32), this fact can be more easily proved by induction. In view of (1.23), formula (1.38) is also valid for discrete random variables X_i.

Since

$Var(X_1 + X_2 + \cdots + X_n) = E\left[\left(\sum_{i=1}^{n} \big(X_i - E(X_i)\big)\right)^2\right]$

multiplying out yields

$Var(X_1 + X_2 + \cdots + X_n) = \sum_{i=1}^{n} Var(X_i) + 2\sum_{i<j} Cov(X_i, X_j)$   (1.39)

For independent X_1, X_2, ..., X_n, all the covariances vanish, so that

$Var(X_1 + X_2 + \cdots + X_n) = \sum_{i=1}^{n} Var(X_i)$   (1.40)
The random variables X_1, X_2, ... are said to be identically distributed as X if all of them have the same probability distribution as X. From a probabilistic point of view, there is no difference between identically distributed random variables. In the case of independent random variables identically distributed as X, formulas (1.38) and (1.39) simplify to

$E(X_1 + X_2 + \cdots + X_n) = n\,E(X), \qquad Var(X_1 + X_2 + \cdots + X_n) = n\,Var(X)$   (1.41)

Now let Z = X + Y be the sum of two independent, continuous random variables X and Y with densities f_X and f_Y. Then the density f_Z(z) of Z is

$f_Z(z) = \int_{-\infty}^{+\infty} f_X(x)\,f_Y(z - x)\,dx$   (1.42)

Integration yields the corresponding formula for the distribution function F_Z(z) of Z:

$F_Z(z) = \int_{-\infty}^{+\infty} F_Y(z - x)\,f_X(x)\,dx$   (1.43)
Since f(x) = dF(x)/dx, or dF(x) = f(x)dx, this relationship can be written in the form

$F_Z(z) = \int_{-\infty}^{+\infty} F_Y(z - x)\,dF_X(x)$   (1.44)

Analogously,

$f_Z(z) = \int_{-\infty}^{+\infty} f_Y(z - x)\,dF_X(x)$   (1.45)
The integrals in (1.42) and (1.43) are called convolutions of the densities f_X and f_Y and of the distribution functions F_X and F_Y, respectively. Hence, analogously to discrete random variables, the following statement is valid:

The distribution function (density) of the sum of two independent random variables is equal to the convolution of their distribution functions (densities):

$F_Z = F_X * F_Y$   (1.46)

$f_Z = f_X * f_Y$   (1.47)

For the sum Z_n = X_1 + X_2 + ··· + X_n of n independent random variables, identically distributed as X with distribution function F and density f, the distribution function and density of Z_n are the n-fold convolutions of F and f with themselves. They can be computed recursively:

$F_{Z_n}(z) = \int_{-\infty}^{+\infty} F_{Z_{n-1}}(z - x)\,dF(x)$   (1.48)

$f_{Z_n}(z) = \int_{-\infty}^{+\infty} f_{Z_{n-1}}(z - x)\,f(x)\,dx$   (1.49)
Example 1.6 (Erlang distribution) Let the independent random variables X_1 and X_2 be exponentially distributed with parameters λ_1 and λ_2:

$f_{X_i}(x) = \lambda_i e^{-\lambda_i x}, \quad x \ge 0; \ i = 1, 2$

By (1.42), the density of Z = X_1 + X_2 is, for λ_1 ≠ λ_2,

$f_Z(z) = \frac{\lambda_1 \lambda_2}{\lambda_2 - \lambda_1}\left(e^{-\lambda_1 z} - e^{-\lambda_2 z}\right), \quad z \ge 0$

If λ_1 = λ_2 = λ, then

$f_Z(z) = \lambda^2 z\,e^{-\lambda z}, \quad z \ge 0$

More generally, let Z_n = X_1 + X_2 + ··· + X_n, where the X_i are independent and identically exponentially distributed with parameter λ. Assume that

$f_{Z_{n-1}}(z) = \lambda \frac{(\lambda z)^{n-2}}{(n-2)!}\,e^{-\lambda z}, \quad z \ge 0$

where f(x) = λe^{−λx}, x ≥ 0. On condition that this assumption is true, the density of Z_n is, by (1.49),

$f_{Z_n}(z) = \int_0^z \lambda \frac{(\lambda(z - x))^{n-2}}{(n-2)!}\,e^{-\lambda(z - x)}\,\lambda e^{-\lambda x}\,dx$

Thus,

$f_{Z_n}(z) = \lambda \frac{(\lambda z)^{n-1}}{(n-1)!}\,e^{-\lambda z}, \quad z \ge 0$
But this is the density of an Erlang distributed random variable with parameters n and λ, so that, by induction, the assumption holds for all n. The corresponding distribution function is

$F_{Z_n}(z) = 1 - e^{-\lambda z} \sum_{i=0}^{n-1} \frac{(\lambda z)^i}{i!}, \quad z \ge 0$   (1.50)

□
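The Erlang result can be checked numerically. The following Python sketch is our own; λ = 1.5, n = 4, and the sample size are illustrative assumptions:

    import math, random

    lam, n = 1.5, 4
    random.seed(4)
    N = 200_000
    sums = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(N)]

    def erlang_cdf(z):
        """Distribution function (1.50) of the Erlang(n, lambda) distribution."""
        return 1 - math.exp(-lam * z) * sum((lam * z) ** i / math.factorial(i)
                                            for i in range(n))

    for z in (1.0, 2.0, 4.0):
        empirical = sum(1 for s in sums if s <= z) / N
        print(z, empirical, erlang_cdf(z))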
Example 1.7 (normal distribution) Let the random variables X_i be independent and normally distributed with parameters μ_i and σ_i²; i = 1, 2:

$f_{X_i}(x) = \frac{1}{\sigma_i \sqrt{2\pi}} \exp\left(-\frac{(x - \mu_i)^2}{2\sigma_i^2}\right), \quad -\infty < x < +\infty; \ i = 1, 2$

Applying the convolution formula (1.42) to Z = X_1 + X_2 and substituting appropriately yields

$f_Z(z) = \frac{1}{\sqrt{2\pi(\sigma_1^2 + \sigma_2^2)}} \exp\left(-\frac{(z - \mu_1 - \mu_2)^2}{2(\sigma_1^2 + \sigma_2^2)}\right)$

so that Z = X_1 + X_2 = N(μ_1 + μ_2, σ_1² + σ_2²).
By Theorem 1.1, a sum of normally distributed random variables is also normally distributed. For independent random variables, example 1.7 yields, by induction, a sharpening of this statement:

Corollary Let Z = X_1 + X_2 + ··· + X_n be the sum of independent random variables X_i = N(μ_i, σ_i²); i = 1, 2, ..., n. Then,

$Z = N\left(\sum_{i=1}^{n} \mu_i,\ \sum_{i=1}^{n} \sigma_i^2\right)$   (1.51)
Theorem 1.2 (Central limit theorem) Let X_1, X_2, ... be an infinite sequence of independent random variables with finite expected values E(X_i) = μ and finite variances Var(X_i) = σ²; i = 1, 2, .... Furthermore, let Z_n = X_1 + X_2 + ··· + X_n and

$S_n = \frac{Z_n - n\mu}{\sigma\sqrt{n}}$

Then,

$\lim_{n \to \infty} P(S_n \le x) = \Phi(x) \quad \text{for all } x$   ■
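The speed of this convergence is easy to observe numerically. The following Python sketch is our own; the uniform(0, 1) summands (μ = 1/2, σ² = 1/12) and n = 50 are illustrative assumptions:

    import math, random

    random.seed(5)
    n, N = 50, 20_000
    mu, sigma = 0.5, math.sqrt(1 / 12)       # mean and std of uniform(0, 1)

    def standardized_sum():
        z = sum(random.random() for _ in range(n))
        return (z - n * mu) / (sigma * math.sqrt(n))

    samples = [standardized_sum() for _ in range(N)]

    # P(S_n <= x) is already close to Phi(x) for moderate n.
    for x in (-1.0, 0.0, 1.0):
        empirical = sum(1 for s in samples if s <= x) / N
        phi = 0.5 * (1 + math.erf(x / math.sqrt(2)))
        print(x, empirical, phi)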
Theorem 1.3 deals with the sum of a random number of random variables. To state this theorem, another important concept has to be introduced:

Definition 1.2 (stopping time) A positive, integer-valued random variable N is called a stopping time for the sequence X_1, X_2, ... if, for all n = 1, 2, ..., the random event "N = n" is independent of X_{n+1}, X_{n+2}, ...

Comment A random event A is independent of a random variable X if for all x the random events A and "X ≤ x" are independent.

Intuitively, the notion "stopping time" can be motivated by assuming that the random variables X_i are observed in turn. For N = n, this process is stopped after having observed X_1, X_2, ..., X_n, i.e., before observing X_{n+1}, X_{n+2}, ...

Example 1.8 Let X_i = 1 if after the i th tossing of a coin a "head" is observed, and X_i = 0 otherwise. Further, let P(X_i = 1) = P(X_i = 0) = 0.5 for all i = 1, 2, ... Then, for instance, the number of tosses up to and including the first head, N = min{i; X_i = 1}, is a stopping time for the sequence X_1, X_2, ...
Theorem 1.3 (Wald's equation) Let X_1, X_2, ... be a sequence of independent random variables and let N be a stopping time for this sequence. Assuming the X_i to be identically distributed as X and the expected values E(X) and E(N) to be finite, then

$E(X_1 + X_2 + \cdots + X_N) = E(N)\,E(X)$   (1.52)
Let I_i be the indicator variable of the event "N ≥ i". I_i = 1 holds if and only if no stopping has occurred after observing the random variables X_1, X_2, ..., X_{i−1}. Since N is a stopping time, I_i is independent of the random variables X_i, X_{i+1}, ... In particular, I_i is independent of X_i, so that

$E(I_i X_i) = E(I_i)\,E(X_i)$

Consider, on the other hand, a fair game with X_i = +1 or −1, each with probability 0.5, stopped at the first n for which X_1 + X_2 + ··· + X_n = 3. Then the left-hand side of equation (1.52) is equal to 3, whereas the right-hand side contains the factor E(X) = 0. This apparent contradiction is resolved by the fact that E(N) = ∞. Therefore, in this case Wald's equation is not applicable.
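Wald's equation is easy to illustrate with the coin-tossing sequence of example 1.8, stopped at the first head. By construction X_1 + ··· + X_N = 1, so (1.52) predicts E(N) = 1/E(X) = 2. The following Python sketch is our own:

    import random

    random.seed(6)

    def tosses_until_first_head():
        """N = number of tosses of a fair coin up to and including the
        first head; N is a stopping time for X_1, X_2, ..."""
        n = 0
        while True:
            n += 1
            if random.randint(0, 1) == 1:    # X_n = 1: head, stop
                return n

    trials = [tosses_until_first_head() for _ in range(100_000)]

    # Wald: 1 = E(X_1 + ... + X_N) = E(N) * E(X) = E(N) * 0.5.
    print(sum(trials) / len(trials))         # close to 2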
1.5 Transformations of Probability Distributions

1.5.1 z-Transformation

Let X be a discrete random variable with range {0, 1, 2, ...} and probability distribution

$\{p_i = P(X = i); \ i = 0, 1, 2, \ldots\}$
Definition 1.3 The z-transform of the random variable X and of its probability distribution {p_0, p_1, p_2, ...}, respectively, is defined as the infinite series

$M(z) = E(z^X) = \sum_{i=0}^{\infty} p_i z^i$   (1.53)

M(z) converges absolutely for |z| ≤ 1:

$|M(z)| \le \sum_{i=0}^{\infty} p_i |z|^i \le \sum_{i=0}^{\infty} p_i = 1$

Letting z = 1 yields

$M(1) = \sum_{i=0}^{\infty} p_i = 1$
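As a numerical illustration of (1.53) (our own sketch; the Poisson distribution with λ = 2.5 and the truncation of the series at i = 99 are assumptions), M(1) = 1 can be checked directly, and a finite difference at z = 1 recovers E(X) = λ, reflecting the fact that M′(1) = E(X):

    import math

    lam = 2.5
    # Poisson probabilities p_i = exp(-lam) * lam**i / i!, computed recursively.
    p = [math.exp(-lam)]
    for i in range(1, 100):
        p.append(p[-1] * lam / i)

    def M(z):
        """z-transform M(z) = sum p_i z^i of the (truncated) distribution."""
        return sum(pi * z ** i for i, pi in enumerate(p))

    h = 1e-6
    print(M(1.0))                            # = 1 (normalization)
    print((M(1.0) - M(1.0 - h)) / h, lam)    # M'(1) = E(X) = lambda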