
Lecture 17: Return and First Passage On A Lattice

1. The document discusses return probability and first-passage times for random walks on lattices. It introduces generating functions to encode discrete probability distributions and describes how they relate to Fourier and Laplace transforms.
2. An example of a biased random walk on the integers is used to illustrate how to calculate the probability of return using generating functions.
3. The reflection principle is introduced as a way to relate the number of random-walk paths between two points to the number crossing a given line.


Scribe: Kenny Kamrin (and Martin Z. Bazant)

Department of Mathematics, MIT
April 7, 2005

1 Return Probability on the Integer Lattice (d = 1)
To begin, we consider a basic example of a discrete first passage process. Consider an unbiased Bernoulli walk on the integers starting at the origin. We wish to determine the probability, R, that the walker eventually returns to 0, regardless of the number of steps it takes.

Without loss of generality, say the walker's first step is to 1. From here, the walker can either step left to 0 or can return to 1 n times before stepping back to 0. To avoid double counting, we must also assert that if the walker comes back to 1 n times, it cannot step left to 0 on any of those visits except the last. Since half of the possible paths from 1 back to 1 stay to the right of 1, the probability of returning to 1 exactly once before hitting 0 is R/2. Likewise, the probability of returning to 1 exactly n times before stepping back to 0 is (R/2)^n. Thus:

\[ R = \frac{1}{2} + \frac{1}{2}\sum_{n=1}^{\infty}\left(\frac{R}{2}\right)^{n} = \frac{1}{2}\sum_{n=0}^{\infty}\left(\frac{R}{2}\right)^{n} = \frac{1}{2(1-R/2)} = \frac{1}{2-R}. \]
This means
\[ R^{2} - 2R + 1 = 0 \quad \Longrightarrow \quad (R-1)^{2} = 0, \]
and hence R = 1: the walker returns to the origin with probability one.
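As a numerical sanity check (not part of the original notes), the result R = 1 can be probed by simulation. A minimal sketch, with a hypothetical cap of 10,000 steps per walk since the true expected return time is infinite; the estimate sits slightly below 1 only because of this truncation:

```python
import random

def returns_to_origin(max_steps=10_000):
    """Run one symmetric Bernoulli walk from 0 and report whether
    it revisits the origin within max_steps steps."""
    x = 0
    for _ in range(max_steps):
        x += random.choice((-1, 1))
        if x == 0:
            return True
    return False

random.seed(1)
trials = 2000
frac = sum(returns_to_origin() for _ in range(trials)) / trials
print(f"estimated return probability: {frac:.3f}")
```

The small shortfall from 1 is consistent with the heavy tail of the return-time distribution: the probability of not returning within n steps decays only like 1/sqrt(n).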
In essence, we've just deduced that a drunk man who aimlessly wanders away from a pub will eventually return, as long as he is confined to a long narrow street. Based on the continuum analysis of the previous lecture, however, he will probably not make it back the same day (or the same week), since the expected return time is infinite!
The preceding simple analysis is not easily generalized to higher dimensions, where the situation can be quite different, e.g. if the drunk man wanders about in a two-dimensional field. As the dimension increases, it makes sense that the walker is less likely to ever find by chance the special point where he started. In the next lecture, we will address the effect of dimension in the return problem (Pólya's theorem), but first we will develop a transform formalism for discrete random walks on a lattice, analogous to the Fourier and Laplace transform methods used earlier for continuous displacements. Of course, we could use the same continuum formulation with generalized functions (like δ(x)) to enforce lattice constraints, but it is simpler to work with discrete generating functions right from the start.
Guest lecturer: Chris H. Rycroft (TA).

M. Z. Bazant, 18.366 Random Walks and Diffusion, Lecture 17


2 Generating Functions on the Integers
Rather than keeping track of a sequence of discrete probabilities, {P(X = n)}, it is convenient to encode the same information in the expansion coefficients, P_n = P(X = n), of the probability generating function (PGF), defined as follows:
\[ f(\xi) = \sum_{n=n_0}^{\infty} P_n\,\xi^{n}. \]
There are two common cases:

1. For probabilities defined on all integers, n_0 = −∞, the PGF is the analytic continuation of a Laurent series,
\[ f(z) = \sum_{n=-\infty}^{\infty} P_n\,z^{n}, \]
which converges in some annulus in the complex plane, R_1 < |z| < R_2. The probabilities are recovered from the PGF by a contour integral around this annulus once counter-clockwise,
\[ P_n = \frac{1}{2\pi i} \oint \frac{f(z)\,dz}{z^{n+1}}. \]
When evaluated on the unit circle, z = e^{ik}, the Laurent series reduces to a complex Fourier series,
\[ f(e^{ik}) = \sum_{n=-\infty}^{\infty} P_n\,e^{ink}, \]
which is the discrete analog of our previous Fourier transform in space.
2. For probabilities defined on the non-negative integers, n_0 = 0, the PGF is the analytic continuation of a Taylor series,
\[ f(z) = \sum_{n=0}^{\infty} P_n\,z^{n}, \]
which converges in a disk in the complex plane, |z| < R. When evaluated on the real axis inside the unit disk with z = e^{-s} (s > 0), the Taylor series resembles a discrete Laplace transform,
\[ f(e^{-s}) = \sum_{n=0}^{\infty} P_n\,e^{-sn}, \]
which is the discrete analog of our previous Laplace transform in time.
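As a small aside (not in the original notes), the contour-integral inversion in case 1 can be carried out numerically: sample the PGF at roots of unity and invert the resulting discrete Fourier series. A minimal sketch, using the single-step distribution P_{−1} = P_1 = 1/2 of a symmetric Bernoulli walk as a stand-in example:

```python
import cmath

# PGF of a single +-1 step with P(-1) = P(1) = 1/2:
# a two-term Laurent series f(z) = (z + 1/z)/2.
def f(z):
    return (z + 1 / z) / 2

# Sample f at the N-th roots of unity and invert the discrete
# Fourier series to recover the coefficients P_n.
N = 8
samples = [f(cmath.exp(2j * cmath.pi * k / N)) for k in range(N)]

def coeff(n):
    return sum(samples[k] * cmath.exp(-2j * cmath.pi * k * n / N)
               for k in range(N)) / N

print(coeff(1).real, coeff(-1).real, coeff(0).real)
```

The recovered coefficients are 1/2 at n = ±1 and 0 at n = 0, as they should be; n is only determined mod N, so N must exceed the support of the distribution.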
For an intuitive discussion of generating functions and their relation to Fourier and Laplace transforms in the present context, see Redner's book, A Guide to First-Passage Processes (recommended reading).

For the remainder of this lecture, we will focus on case 2, which has relevance for the return problem in one dimension. In this case, similar to the continuous transforms, the PGF has some very useful properties:
- f(1) = Σ_{n=0}^∞ P(X = n) = 1 (by normalization), which implies that the PGF converges inside the unit disk, |ξ| ≤ 1.



- All of the moments of the distribution are encoded in the Taylor expansion of the PGF about ξ = 1 (analogous to the Taylor coefficients of the Fourier transform at the origin). For example,
\[ f'(\xi) = \sum_{n=0}^{\infty} n\,P(X = n)\,\xi^{n-1} \quad \Longrightarrow \quad f'(1) = \sum_{n=0}^{\infty} n\,P(X = n) = \langle X \rangle. \]
Similarly,
\[ f''(\xi) = \sum_{n=0}^{\infty} n(n-1)\,\xi^{n-2}\,P(X = n) \quad \Longrightarrow \quad f''(1) = \langle X^{2} \rangle - \langle X \rangle, \]
implying that ⟨X²⟩ = f''(1) + f'(1).
- A discrete convolution theorem also holds. Suppose Y is a random variable independent of X, with probability generating function g(ξ). Then the PGF of Z = X + Y is
\[ h(\xi) = \sum_{n=0}^{\infty} P(X+Y = n)\,\xi^{n} = \sum_{n=0}^{\infty} \sum_{i=0}^{n} P(X = i)\,P(Y = n-i)\,\xi^{n}. \]
Letting k = n − i, we have
\[ h(\xi) = \sum_{i=0}^{\infty} P(X = i) \sum_{k=0}^{\infty} P(Y = k)\,\xi^{i+k} = f(\xi)\,g(\xi). \]
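The two moment identities above can be sanity-checked with exact arithmetic; a minimal sketch, using a fair six-sided die as an illustrative distribution (not one from the lecture):

```python
from fractions import Fraction

# pmf of a fair die, indexed so that P[n] = P(X = n).
P = [Fraction(0)] + [Fraction(1, 6)] * 6

f1 = sum(n * P[n] for n in range(len(P)))            # f'(1)
f2 = sum(n * (n - 1) * P[n] for n in range(len(P)))  # f''(1)

mean = f1                     # <X>   = f'(1)
second_moment = f2 + f1       # <X^2> = f''(1) + f'(1)
print(mean, second_moment)    # 7/2 91/6
```

Both values agree with the direct sums Σ n P(X = n) = 7/2 and Σ n² P(X = n) = 91/6.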
For example, consider a Poisson distribution with parameter λ:
\[ P(X = n) = e^{-\lambda}\,\lambda^{n}/n! \quad \Longrightarrow \quad f(\xi) = e^{-\lambda} \sum_{n=0}^{\infty} (\lambda\xi)^{n}/n! = e^{-\lambda}\,e^{\lambda\xi} = e^{\lambda(\xi-1)}. \]
We get f(1) = e^0 = 1 as we expect, and f'(1) = λ = ⟨X⟩. If we let Y be an independent Poisson variable with parameter μ, we may define the variable Z = X + Y. By the last property, the PGF for Z is h(ξ) = e^{(λ+μ)(ξ−1)}, telling us that the sum of two independent Poisson variables also has a Poisson distribution (with parameter λ + μ).
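This closure property can be verified numerically by convolving two truncated Poisson pmfs, matching the product-of-PGFs rule coefficient by coefficient; the parameters λ = 2 and μ = 3 below are arbitrary illustrative choices:

```python
import math

def poisson_pmf(lam, nmax):
    """Poisson probabilities P(X = n) for n = 0..nmax."""
    return [math.exp(-lam) * lam**n / math.factorial(n)
            for n in range(nmax + 1)]

lam, mu, nmax = 2.0, 3.0, 60
px, py = poisson_pmf(lam, nmax), poisson_pmf(mu, nmax)

# Since h = f*g as PGFs, the pmf of Z = X + Y is the discrete
# convolution of the pmfs of X and Y.
pz = [sum(px[i] * py[n - i] for i in range(n + 1))
      for n in range(nmax + 1)]

err = max(abs(a - b) for a, b in zip(pz, poisson_pmf(lam + mu, nmax)))
print(f"max deviation from Poisson(5): {err:.1e}")
```

The convolved pmf matches Poisson(λ + μ) to floating-point accuracy on the truncated range.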
3 First Passage on a Lattice
Define P_n(s|s_0) as the probability of being at lattice point s after n steps, given that the walk started at s_0. Also, define F_n(s|s_0) as the probability of arriving at site s for the first time on the n-th step, given that the walker begins at s_0.
One condition we know must hold is
\[ \sum_{s} P_n(s|s_0) = 1, \]
since the walker must be somewhere on the lattice. We may also define
\[ R(s|s_0) = \sum_{n=1}^{\infty} F_n(s|s_0) \]
as the probability that site s is ever reached by a walker starting from site s_0. Our initial conditions for a walker beginning at s_0 are
\[ P_0(s|s_0) = \delta_{s s_0} \quad \text{and} \quad F_0(s|s_0) = 0, \]
where δ is the Kronecker delta function. Use the following notation for the generating functions:
\[ P(s|s_0;\xi) = \sum_{n=0}^{\infty} P_n(s|s_0)\,\xi^{n}, \qquad F(s|s_0;\xi) = \sum_{n=0}^{\infty} F_n(s|s_0)\,\xi^{n}. \]
The probability that a walker starting from s_0 is at s after n steps can be decomposed according to the step j on which the walker first arrives at s, followed by a return to s in the remaining n − j steps. Thus we can write:
\[ P_n(s|s_0) = \sum_{j=1}^{n} F_j(s|s_0)\,P_{n-j}(s|s) \]
as long as n ≥ 1. In the case that n = 0, P_0(s|s_0) = δ_{s s_0}, so altogether we have
\[ P_n(s|s_0) = \delta_{s s_0}\,\delta_{n0} + \sum_{j=1}^{n} F_j(s|s_0)\,P_{n-j}(s|s). \]
By the convolution-like property of PGFs we can now write
\[ P(s|s_0;\xi) = \delta_{s s_0} + F(s|s_0;\xi)\,P(s|s;\xi) \quad \Longrightarrow \quad F(s|s_0;\xi) = \frac{P(s|s_0;\xi) - \delta_{s s_0}}{P(s|s;\xi)}. \]
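To see this relation in action, one can invert it order by order: given the occupation probabilities P_n(0|0) of the symmetric walk on the integers, the coefficients of F fall out recursively from the renewal relation. A sketch in exact rational arithmetic (not from the lecture), which recovers the familiar first-return probabilities F_2 = 1/2, F_4 = 1/8, F_6 = 1/16:

```python
from fractions import Fraction
from math import comb

# Occupation probabilities P_n(0|0) for the symmetric walk on the
# integers: nonzero only for even n, with P_{2m} = C(2m, m) / 4^m.
N = 12
P = [Fraction(comb(n, n // 2), 2**n) if n % 2 == 0 else Fraction(0)
     for n in range(N + 1)]

# Invert P_n = delta_{n0} + sum_{j=1}^{n} F_j P_{n-j} term by term,
# using P_0 = 1 to solve for F_n at each order.
F = [Fraction(0)] * (N + 1)
for n in range(1, N + 1):
    F[n] = P[n] - sum(F[j] * P[n - j] for j in range(1, n))

print(F[2], F[4], F[6])   # 1/2 1/8 1/16
```

The partial sums of F approach 1 from below, consistent with R = 1 for the one-dimensional walk.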


This is comparable to the result in the continuum case using a Laplace transform. Using this, we can write
\[ R(s|s_0) = \sum_{n=1}^{\infty} F_n(s|s_0) = F(s|s_0;1) = \frac{P(s|s_0;1) - \delta_{s s_0}}{P(s|s;1)}, \]
and thus the probability of return to the starting point is
\[ R(s_0|s_0) = 1 - \frac{1}{P(s_0|s_0;1)}. \]
4 First Passage Example
Let us now consider a biased Bernoulli walk on the integers, which steps right with probability p and left with probability q = 1 − p. Then
\[ P_{2m}(0|0) = \binom{2m}{m} p^{m} q^{m}, \]
where we use 2m because a walk that returns must do so in an even number of steps. Hence
\[ P(0|0;\xi) = \sum_{m=0}^{\infty} \binom{2m}{m} p^{m} q^{m} \xi^{2m} = (1 - 4pq\,\xi^{2})^{-1/2}, \]
and hence the probability of return is
\[ R(0|0) = 1 - \sqrt{1 - 4pq} = 1 - \sqrt{1 - 4p(1-p)} = 1 - \sqrt{(2p-1)^{2}} = 1 - |2p-1|. \]
This result agrees with our first example: just let p = 1/2 and we get R = 1.
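The closed form can be cross-checked by truncating the series for P(0|0;1) directly. A numerical sketch (the cutoff mmax = 5000 is an arbitrary choice; p = 1/2, where the series diverges and R = 1, is excluded):

```python
def return_probability(p, mmax=5000):
    """R(0|0) = 1 - 1/P(0|0;1), with the series
    P(0|0;1) = sum_m C(2m,m) (pq)^m truncated at mmax.
    Successive terms are built via the ratio
    C(2m+2, m+1)/C(2m, m) = 2(2m+1)/(m+1) to avoid
    huge intermediate binomial coefficients."""
    q = 1 - p
    term, total = 1.0, 1.0
    for m in range(mmax):
        term *= 2 * (2 * m + 1) / (m + 1) * p * q
        total += term
    return 1 - 1 / total

for p in (0.6, 0.8):
    print(p, return_probability(p), 1 - abs(2 * p - 1))
```

For p = 0.6 the truncated series gives P(0|0;1) ≈ 5, hence R ≈ 0.8 = 1 − |2p − 1|, and similarly R ≈ 0.4 for p = 0.8.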
5 Reflection Principle

The following is the beginning of a derivation of the arc-sine law, given as a simulation problem on the first problem set (fraction of the time spent in a given region).

Consider a symmetric Bernoulli walk on the integers. Let N(x,t) be the number of paths from (0,0) to (x,t). We know that
\[ N(x,t) = \frac{t!}{\left(\frac{t+x}{2}\right)!\,\left(\frac{t-x}{2}\right)!} = \binom{t}{\frac{t+x}{2}}. \]
Let X_y(x,t) be the number of such paths which touch the level y > x. We may create a new path, one which begins at the origin and ends at 2y − x at time t, by simply reflecting the portion of a path in X_y(x,t) after its last crossing of y about the line x = y. We have just, in effect, defined a bijection between the paths counted by X_y(x,t) and those counted by N(2y − x, t), under which
\[ X_y(x,t) = N(2y - x,\, t). \]
This principle has important applications related to first passage and return on a lattice. See, for example, Feller, Vol. 1 (1970), from the recommended reading, for the application to the arc-sine law.
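The bijection can be checked by brute force for small t (an illustrative sketch, not from the notes): enumerate all 2^t step sequences and compare the count of paths touching level y against N(2y − x, t).

```python
from itertools import product
from math import comb

def N(x, t):
    """Number of +-1 paths from (0,0) to (x,t)."""
    if (t + x) % 2 or abs(x) > t:
        return 0
    return comb(t, (t + x) // 2)

def paths_touching(x, t, y):
    """Count t-step paths ending at x whose running maximum reaches
    the level y > x, by enumerating all 2^t step sequences."""
    count = 0
    for steps in product((-1, 1), repeat=t):
        pos, peak = 0, 0
        for s in steps:
            pos += s
            peak = max(peak, pos)
        if pos == x and peak >= y:
            count += 1
    return count

# Reflection principle: X_y(x,t) = N(2y - x, t).
t, x, y = 8, 0, 2
print(paths_touching(x, t, y), N(2 * y - x, t))  # both 28
```

Of the C(8,4) = 70 eight-step paths returning to 0, exactly C(8,6) = 28 touch level 2, as the reflection principle predicts.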
[Figure: a path from (0,0) to (x,t) and its reflection about the line y, ending at (2y − x, t).]