
Stat 150 Stochastic Processes

Spring 2009

Lecture 26: Brownian Motion


Lecturer: Jim Pitman

Brownian Motion

Idea: some kind of continuous limit of a random walk.


[Figure: sketch of a random walk with N steps per unit time on [0, 1].]

Consider a random walk with independent steps ±σ for some σ, taking the values +σ and −σ with probability 1/2 each, and N steps per unit time. Select σ = σ_N so that the value of the walk at time 1 has a limit distribution as N → ∞.

Background: coin-tossing walk with independent ±1 steps X_i and

S_N = X_1 + ... + X_N,   E(S_N) = 0,   E(S_N^2) = N.

If instead the steps are ±σ_N, we get σ_N S_N, and

E(σ_N S_N)^2 = σ_N^2 E(S_N^2) = σ_N^2 N → 1   if we take σ_N = 1/√N.

The value of our process at time 1 with this scaling is then

S_N / √N →_d Normal(0, 1)   as N → ∞,

according to the normal approximation to the binomial distribution. The same is true for any distribution of the steps X_i with mean 0 and variance 1, by the Central Limit Theorem.

[Figure: scaled random walk with N steps per unit time; the value at time 1 is approximately Normal(0, 1).]
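As a quick numerical illustration of this scaling (my own sketch, not part of the original lecture; the step count and number of trials are arbitrary), one can simulate the coin-tossing walk and check that S_N/√N has mean near 0, variance near 1, and roughly standard normal tails:

```python
# Sketch: check that S_N / sqrt(N) for a +/-1 coin-tossing walk is
# approximately Normal(0, 1) when N is large.
import numpy as np

rng = np.random.default_rng(0)
N = 10_000       # steps per walk
trials = 5_000   # independent walks

steps = rng.choice([-1, 1], size=(trials, N))  # +1 or -1 with probability 1/2
scaled = steps.sum(axis=1) / np.sqrt(N)        # S_N / sqrt(N), one value per walk

print("mean    ", scaled.mean())   # should be close to 0
print("variance", scaled.var())    # should be close to 1
print("P(S_N/sqrt(N) > 1) =", (scaled > 1).mean(), " vs standard normal tail ~0.159")
```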

The scaled process at time t has value S_⌊tN⌋ / √N. Write

S_⌊tN⌋ / √N = (S_⌊tN⌋ / √⌊tN⌋) · (√⌊tN⌋ / √N).

Fix 0 < t < 1 and let N → ∞. Then S_⌊tN⌋ / √⌊tN⌋ →_d Normal(0, 1) and √⌊tN⌋ / √N → √t. So

S_⌊tN⌋ / √N →_d Normal(0, t),

the normal distribution with mean 0, variance t, and standard deviation √t. In the distributional limit as N → ∞, we pick up a process (B_t, t ≥ 0) with some nice properties:

B_t ~ Normal(0, t): E(B_t) = 0, E(B_t^2) = t.

For 0 < s < t, B_s and B_t − B_s are independent, with

B_t = B_s + (B_t − B_s),   B_t − B_s ~ Normal(0, t − s),   (B_t | B_s) ~ Normal(B_s, t − s).

For times 0 < t_1 < t_2 < ... < t_n, the increments B_{t_1}, B_{t_2} − B_{t_1}, ..., B_{t_n} − B_{t_{n−1}} are independent Normal(0, t_i − t_{i−1}) random variables for 1 ≤ i ≤ n, with t_0 = 0. This defines the finite-dimensional distributions of a stochastic process (B_t, t ≥ 0) called a Standard Brownian Motion or a Wiener Process.

Theorem (Norbert Wiener): It is possible to construct such a Brownian motion on a probability space (Ω, F, P), where P is a countably additive probability measure, and the path t ↦ B_t(w) is continuous for all w ∈ Ω.
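The finite-dimensional description suggests a direct way to simulate approximate Brownian paths: generate independent Normal(0, t_i − t_{i−1}) increments on a grid and take cumulative sums. A minimal sketch (my own code, not from the notes; the grid size and horizon are arbitrary choices):

```python
# Sketch: simulate a Brownian path on a grid using independent increments
# B_{t_i} - B_{t_{i-1}} ~ Normal(0, t_i - t_{i-1}), starting from B_0 = 0.
import numpy as np

rng = np.random.default_rng(1)
T, n_steps = 1.0, 1_000
dt = T / n_steps
t = np.linspace(0.0, T, n_steps + 1)

increments = rng.normal(0.0, np.sqrt(dt), size=n_steps)
B = np.concatenate([[0.0], np.cumsum(increments)])   # B[i] approximates B_{t[i]}

# Marginal check: B_T should be Normal(0, T).
B_T_samples = rng.normal(0.0, np.sqrt(dt), size=(10_000, n_steps)).sum(axis=1)
print("Var(B_T) =", B_T_samples.var(), " (theory:", T, ")")
```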


More formally, we can take Ω = C[0, ∞), the space of continuous functions from [0, ∞) to R. Then for a function (also called a path) w = (w(t), t ≥ 0) ∈ Ω, the value of B_t(w) is simply w(t). This is the canonical path space viewpoint. This way P is a probability measure on the space of continuous functions; this P is called Wiener measure.

Note: it is customary to use the term Brownian motion only if the paths of B are continuous with probability one.

Context: Brownian motion is an example of a Gaussian process (Gaussian = Normal). For an arbitrary index set I, a stochastic process (X_t, t ∈ I) is called Gaussian if for every selection of t_1, t_2, ..., t_n ∈ I, the joint distribution of X_{t_1}, X_{t_2}, ..., X_{t_n} is multivariate normal. Here multivariate normal means that Σ_i a_i X_{t_i} has a one-dimensional normal distribution for whatever choice of a_1, ..., a_n. Equivalently, X_{t_1}, X_{t_2}, ..., X_{t_n} can be constructed as n (typically different) linear combinations of n independent normal variables, as in the sketch below.

Easy fact: the finite-dimensional distributions (FDDs) of a Gaussian process are completely determined by its mean and covariance functions

μ(t) := E(X_t),   γ(s, t) := E[(X_s − μ(s))(X_t − μ(t))].
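To make the "linear combinations of independent normals" remark concrete, here is a small sketch (my own illustration; it assumes the Brownian covariance γ(s, t) = min(s, t), which is computed later in these notes): the Cholesky factor of the covariance matrix turns n independent standard normals into one draw of (B_{t_1}, ..., B_{t_n}).

```python
# Sketch: build the finite-dimensional distribution of BM at fixed times as
# linear combinations of independent Normal(0, 1) variables, using the
# Cholesky factor of the covariance matrix cov[i, j] = min(t_i, t_j).
import numpy as np

rng = np.random.default_rng(2)
times = np.array([0.25, 0.5, 0.75, 1.0])
cov = np.minimum.outer(times, times)   # min(t_i, t_j)
L = np.linalg.cholesky(cov)            # cov = L @ L.T

Z = rng.standard_normal((len(times), 100_000))   # independent N(0, 1) rows
samples = L @ Z                                  # each column: one draw of (B_{t_1}, ..., B_{t_n})

print("empirical covariance:\n", np.round(np.cov(samples), 3))
print("target covariance:\n", cov)
```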

Here the covariance function γ is non-negative definite, meaning it satisfies the system of inequalities implied by Var(Σ_i a_i X_{t_i}) ≥ 0, whatever the choice of real numbers a_i and indices t_i.

Key consequences: if we have a process B̃ with the same mean and covariance function as a BM B, then the finite-dimensional distributions of B̃ and B are identical. If, moreover, the paths of B̃ are continuous, then B̃ is a BM.

Example: Start with (B_t, t ≥ 0) = BM, fix a number v ≥ 0, and look at (B_{v+t} − B_v, t ≥ 0). This is a new BM, independent of (B_s, 0 ≤ s < v).
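A small simulation check of this restart property (my own sketch; the grid and the choice v = 1 are arbitrary): the increment after time v should have a Normal(0, t) marginal and be uncorrelated with the value at time v (and for jointly Gaussian variables, zero correlation amounts to independence).

```python
# Sketch: check that B_{v+t} - B_v behaves like a fresh BM value, uncorrelated
# with B_v, using discretized paths with v = 1.0 and t = 1.0.
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps, dt = 5_000, 2_000, 0.001   # total horizon 2.0
paths = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)).cumsum(axis=1)

v_idx = 1_000                     # grid index of time v = 1.0
B_v = paths[:, v_idx - 1]         # B_v
incr = paths[:, -1] - B_v         # B_{v+1} - B_v

print("Var(B_{v+1} - B_v) =", incr.var(), " (theory: 1.0)")
print("corr(B_v, increment) =", np.corrcoef(B_v, incr)[0, 1], " (theory: 0)")
```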
Time inversion: let B̃_t := t B(1/t) for t > 0, and B̃_0 := 0 for t = 0. Check that B̃ is a BM. Note first that B̃ is also a Gaussian process.


(1) Continuity of paths. Away from t = 0, this is clear. At t = 0, does t B(1/t) → 0 as t → 0? Compute

Var(t B(1/t)) = t^2 Var(B(1/t)) = t^2 · (1/t) = t → 0.

With some more care, it is possible to establish path continuity at 0 (convergence with probability one).

(2) Mean and covariances.

E(t B(1/t)) = t E(B(1/t)) = 0 = E(B_t).

Covariances: for 0 < s < t,

E(B_s B_t) = E(B_s (B_s + B_t − B_s)) = E(B_s^2) + E(B_s (B_t − B_s)) = s + 0 = s,

E(B̃_s B̃_t) = E(s B(1/s) · t B(1/t)) = st E(B(1/s) B(1/t)) = st · (1/t) = s,   since 1/t < 1/s.

So B̃ has the same mean and covariance function as B, and hence the same finite-dimensional distributions.
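A numerical sanity check of these formulas (my own sketch; the times s = 0.5 and t = 0.8 are arbitrary): simulate (B(1/t), B(1/s)) directly from independent normal increments and verify that Var(t B(1/t)) ≈ t and E(s B(1/s) · t B(1/t)) ≈ s.

```python
# Sketch: check Var(tilde_B_t) = t and E(tilde_B_s * tilde_B_t) = s for s < t,
# where tilde_B_t = t * B(1/t).
import numpy as np

rng = np.random.default_rng(4)
s, t = 0.5, 0.8                  # 0 < s < t, so 1/t < 1/s
n = 200_000

B_inv_t = rng.normal(0.0, np.sqrt(1.0 / t), size=n)                       # B(1/t) ~ N(0, 1/t)
B_inv_s = B_inv_t + rng.normal(0.0, np.sqrt(1.0 / s - 1.0 / t), size=n)   # B(1/s)

tilde_s, tilde_t = s * B_inv_s, t * B_inv_t
print("Var(tilde_B_t) =", tilde_t.var(), " (theory:", t, ")")
print("E(tilde_B_s * tilde_B_t) =", (tilde_s * tilde_t).mean(), " (theory:", s, ")")
```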

Example: Distribution of the maximum on [0, t]: M_t := sup_{0 ≤ s ≤ t} B_s.

Fact: M_t has the same distribution as |B_t|:

P(M_t > x) = 2 P(B_t > x),   P(M_t ∈ dx) = (2/√(2πt)) e^{−x^2/(2t)} dx,   x > 0.


[Figure: a Brownian path and its reflection about the level x after the first hitting time of x.]


Sketch of a proof: We have B =_d −B, that is, the processes B and −B have the same distribution. Then

P(M_t > x, B_t < x) = P(M_t > x, B_t > x) = P(B_t > x),

where the first equality is a reflection argument and the second holds because B_t > x forces M_t > x. The reflection step uses B =_d −B applied to (B(T_x + v) − x, v ≥ 0) instead of B (via the strong Markov property), where T_x is the first hitting time of x by B. Adding the two probabilities and using P(B_t = x) = 0 gives P(M_t > x) = 2 P(B_t > x).
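A simulation sketch of this identity (my own code, not from the notes; the discretized grid slightly underestimates the true running maximum, so expect the two numbers to be close but not exact):

```python
# Sketch: check P(M_t > x) ~ 2 P(B_t > x) by simulating discretized Brownian
# paths and taking the running maximum along each path.
import numpy as np

rng = np.random.default_rng(5)
t, x = 1.0, 0.5
n_paths, n_steps = 10_000, 1_000
dt = t / n_steps

paths = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)).cumsum(axis=1)
M_t = paths.max(axis=1)      # running maximum on the grid (slightly too small)
B_t = paths[:, -1]

print("P(M_t > x)   =", (M_t > x).mean())
print("2 P(B_t > x) =", 2 * (B_t > x).mean())
```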
