
On Powers of Gaussian White Noise

Article in IEEE Transactions on Information Theory · October 2010


DOI: 10.1109/TIT.2011.2158062 · Source: arXiv



On Powers of Gaussian White Noise

A. V. BALAKRISHNAN 1 and Ravi R. MAZUMDAR2

1 Departments of Electrical Engineering and Mathematics, University of California, Los Angeles, CA 90024, USA
2 Department of Electrical and Computer Engineering, University of Waterloo, Waterloo, ON N2L 3G1, Canada

arXiv:1010.2992v1 [cs.IT] 14 Oct 2010

October 15, 2010

Abstract
Classical Gaussian white noise in communications and signal processing is viewed as the limit of zero-mean second-order Gaussian processes with a compactly supported flat spectral density as the support goes to infinity. The difficulty in developing a theory of nonlinear transformations of white noise has been interpreting the corresponding limits. In this paper we show that a renormalization and centering of powers of band-limited Gaussian processes is Gaussian white noise, and as a consequence, homogeneous polynomials under suitable renormalization remain white noise.

Keywords: Gaussian white noise, weak distributions, band-limited processes, finitely additive measures, asymptotics

Short-title: Powers of white noise

1 Introduction
White noise is central to the development of statistical signal processing and to models for communication channels. Its centrality stems from the fact that any second-order covariance function can be realized as the output of a linear filter driven by white noise. In the stationary case, white noise is the basic building block for constructing optimal filters. In the classical context, white noise is viewed as the limit of a second-order process that has a flat spectral density of compact or finite support (referred to as the bandwidth of the process) as the support becomes infinite [22, 16, 5]. Such a process is not physically realizable, because it would have infinite energy, and yet white noise plays a crucial role in developing practical filters.
The difficulty of defining white noise is not due only to the infinite energy. Indeed, defining a continuous-time white noise process presents difficulties even in the probabilistic context. This is because the induced probability measure cannot be countably additive on the space $L_2[0,T]$; i.e., constructing a Gaussian process whose covariance function is a Dirac delta function results in the underlying probability measure being only finitely additive, and thus the white noise map $n_t(\omega)$ is not a bona fide random variable [6, 20, 10]. Indeed, because of the difficulty of dealing with white noise mathematically, Balakrishnan, in a series of papers [4, 5, 7], developed a finitely additive framework for analyzing white noise processes and the associated calculus. This was further developed through the idea of liftings in the work of Kallianpur and Karandikar [17]. An alternative approach, exploiting the structure of abstract Wiener spaces, can be found in the work of Kuo [18] and Gross [13, 14], for example, where the idea is to work on the lifted space through a lifting map taking values in the abstract Wiener space.
Classical white noise is defined only as a generalized process, i.e., it does not induce a countably additive measure on the Hilbert space. However, certain transformations of white noise do induce countably additive measures. This is closely related to the result of Sazonov [20] on the existence of countably additive measures on Hilbert spaces. In the linear context, any continuous linear operator, such as a kernel operator, acting on white noise results in a map that defines a bona fide stochastic process. The integral operator $Ln(t) = \int_0^t n_s\,ds$, as a mapping from $L_2[0,T] \to L_2[0,T]$, induces a countably additive measure whose extension to $C(0,T)$ [7, 8] is the Wiener measure [18, 21]. Viewed this way, Gaussian white noise is the derivative of the Wiener process, even though the Wiener process is almost surely not differentiable at any $t$. However, when restricting ourselves to linear operators we can make sense of the results, because in most applications (filtering and communications) such operators are Hilbert-Schmidt [7, 18] and we can interpret the results both probabilistically and algorithmically. For nonlinear transformations the situation is more complicated. Indeed, one class of nonlinear transformations that induce countably additive measures is that associated with $S$-continuous mappings, where the $S$-topology is the one associated with semi-norms generated by nuclear operators [7, 2, 3, 9].
There still remains the question of whether nonlinear transformations, powers for example, of white noise can be suitably interpreted, even as generalized processes [11, 10]. The need for such an interpretation arises in many applications in mathematical and quantum physics [1, 19], for example. Indeed, in [19] the authors provide a heuristic justification for treating a renormalization of a squared white noise term as white noise.

In this paper we show that under a suitable renormalization, integral powers of Gaussian white noise, viewed as the limit of a band-limited Gaussian process with flat spectral density, are indeed Gaussian white noise; this is a non-trivial fact, given that nonlinear transformations of Gaussian random variables are not Gaussian.

2 Some preliminaries

Let us first recall some classical results related to white noise. For simplicity we restrict ourselves to real-valued processes; the extension to $\Re^d$-valued processes is direct. We denote the inner product in $L_2[0,T]$ by $[f,g]_T = \int_0^T f(s)g(s)\,ds$, $f,g \in L_2[0,T]$, and the norm by $||f||_T^2 = [f,f]_T$. The space $L_2(-\infty,\infty)$ is simply denoted $L_2$.

Let $\{X_W(t), -\infty < t < \infty\}$ denote a zero-mean stationary Gaussian process. Let $R(t) = \mathbb{E}[X(t+s)X(s)]$ denote the covariance, and assume $\int_{-\infty}^{\infty} |R(t)|\,dt < \infty$. By Bochner's theorem there exists a spectral density $S(\lambda)$, $-\infty < \lambda < \infty$, given by $S(\lambda) = \int_{-\infty}^{\infty} R(t)\, e^{-i2\pi\lambda t}\,dt$.

Now suppose $S(\lambda)$ is band-limited and flat:

$$S_W(\lambda) = \begin{cases} 1, & -W \le \lambda < W \\ 0, & \text{otherwise} \end{cases}$$

Since $S(\lambda)$ has support in $[-W,W]$, there exists a Gaussian spectral process $\hat{X}_W(\lambda)$, independent on non-overlapping intervals, such that for every $(a,b)$, $\int_a^b S_W(\lambda)\,d\lambda = \mathbb{E}[\int_a^b \int_a^b \hat{X}(d\lambda)\hat{X}(d\lambda')]$ and

$$X(t) = \int_{-\infty}^{\infty} e^{i2\pi\lambda t}\, \hat{X}_W(d\lambda), \quad -\infty < t < \infty \quad (2.1)$$

Moreover the limit in mean square (denoted q.m.)

$$\operatorname*{q.m.\,lim}_{W\to\infty}\, X_W(t) = X(t) \quad (2.2)$$

exists and is called Gaussian white noise.

In particular, such a limiting process will have the following properties:

i) The random variables $y = \int_0^T X(s)f(s)\,ds$ are distributed as $N(0, ||f||_T^2)$, where $f \in L_2[0,T]$.

ii) $\mathbb{E}\left([f,X]_T [g,X]_T\right) = [f,g]_T$.

iii) Let $\{\phi_i(t)\}$ be orthonormal functions in $L_2[0,T]$ and consider the collection of random variables $Y = \mathrm{col}(y_1, y_2, \ldots, y_N)$, where $y_i = \int_0^T \phi_i(t)X(t)\,dt$. Then $Y \sim N(0, I_N)$, where $0$ is the $N$-dimensional zero vector and $I_N$ is the $N \times N$ identity matrix.

iv) $\mathbb{E}[\int_0^T f(s)X(s)\,ds\; X(t)] = f(t)$ for $0 \le t \le T$, and $0$ otherwise.

This is equivalent to saying that $\{X(t), -\infty < t < \infty\}$ is a zero-mean stationary Gaussian process with covariance $R(t,s) = R(t-s) = \delta(t-s)$, where $\delta(\cdot)$ denotes the Dirac delta function.
Indeed, let

$$R_W(t) = \int_{-W}^{W} S_W(\lambda)\, e^{i2\pi\lambda t}\,d\lambda = \frac{\sin 2\pi W t}{\pi t}, \quad -\infty < t < \infty \quad (2.3)$$

and hence for any $f \in L_2$ we have:

$$\lim_{W\to\infty} \int_{-\infty}^{\infty} R_W(t-s) f(s)\,ds = f(t) \quad (2.4)$$

where the limit is in $L_2$.

Clearly such a process is not physically realizable, since by Bochner's theorem $R(0) = \lim_{W\to\infty} \int_{-W}^{W} 1\,d\lambda = \infty$; in other words, its sample paths cannot be in $L_2$. It is worth remarking from the above that the process $Y(t) = \lim_{W\to\infty} \int_0^t X_W(s)\,ds$ is a zero-mean Gaussian process with variance $t$, i.e., Brownian motion. The point is that the process $X(t)$ is not well defined, and thus $X(t)$ is only formally the derivative of $Y(t)$.
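Relation (2.4), the sinc kernel $R_W$ acting as an approximate identity on $L_2$, is easy to confirm numerically. A small sketch (the grid sizes and the Gaussian test function are arbitrary choices of ours):

```python
import numpy as np

def R_W(t, W):
    """Covariance kernel R_W(t) = sin(2*pi*W*t)/(pi*t) of eq. (2.3)."""
    return np.where(np.abs(t) < 1e-12, 2.0 * W,
                    np.sin(2.0 * np.pi * W * t) / (np.pi * t))

def lowpass_at(f, t0, W, lo=-12.0, hi=12.0, n=480_001):
    """Riemann-sum approximation of (R_W * f)(t0) = integral of R_W(t0-s) f(s) ds."""
    s = np.linspace(lo, hi, n)
    return (R_W(t0 - s, W) * f(s)).sum() * (s[1] - s[0])

f = lambda s: np.exp(-s * s)            # a smooth L2 test function
for W in (0.25, 1.0, 8.0):
    print(W, lowpass_at(f, 0.5, W))     # approaches f(0.5) = exp(-0.25) as W grows
```

Convolution with $R_W$ is exactly an ideal low-pass filter at cutoff $W$; since the Gaussian's spectrum decays rapidly, even moderate $W$ reproduces $f(t_0)$ to high accuracy.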
Herein lies the problem. Clearly we can make sense of operations in which $L_2$ functions act on $X$, and hence problems where white noise is the input to a linear time-invariant system can be given mathematical meaning by the limiting arguments (in q.m.). However, even simple nonlinear operations such as squaring, i.e., $Y(t) = X^2(t)$, do not make sense, because such a process would have infinite mean and its covariance would be a product of delta functions, which is not defined in any meaningful way.

In the following section we show that the squaring operation does make sense if we perform a suitable renormalization of the process $X_W(\cdot)$, and that the limiting process is then itself Gaussian white noise. This is indeed an unexpected result, since squaring a single Gaussian random variable results in a chi-squared random variable.

3 Renormalized powers of white noise

First note that since $\{X_W(t), -\infty < t < \infty\}$ is a stationary Gaussian process, we can represent $X_W(t)$ as:

$$X_W(t) = \frac{R_W(t)}{R_W(0)}\, X(0) + \nu_W(t) \quad (3.5)$$

where $\nu_W(t)$ is a zero-mean Gaussian r.v., independent of $X(0)$, with variance given by:

$$\mathbb{E}[\nu_W^2(t)] = R_{\nu,W}(t) = \frac{R_W^2(0) - R_W^2(t)}{R_W(0)} \quad (3.6)$$

Let us now define the following process:

$$Y_W(t) = \frac{1}{\sqrt{2}\sqrt{2W}}\left(X_W^2(t) - \mathbb{E}[X_W^2(t)]\right) = \frac{X_W^2(t) - 2W}{2\sqrt{W}}, \quad -\infty < t < \infty \quad (3.7)$$

Then $Y_W(t)$ is a centered (mean-0), renormalized process representing the nonlinear transformation (squaring) of the pre-white-noise process. We now state and prove the main result.
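The covariance structure behind (3.7) can be spot-checked by Monte Carlo: for the jointly Gaussian pair $(X_W(t), X_W(0))$ with variance $R_W(0) = 2W$, Isserlis' theorem gives $\mathrm{cov}(X_W^2(t), X_W^2(0)) = 2R_W^2(t)$, so the renormalization yields $\mathrm{cov}(Y_W(t), Y_W(0)) = R_W^2(t)/(2W)$. A sketch (the particular $W$, $t$, and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

W, t = 5.0, 0.03
R0 = 2.0 * W                                      # R_W(0)
Rt = np.sin(2.0 * np.pi * W * t) / (np.pi * t)    # R_W(t), eq. (2.3)

# sample the jointly Gaussian pair (X_W(t), X_W(0)) from its 2x2 covariance
cov = np.array([[R0, Rt], [Rt, R0]])
x = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)

y = (x**2 - R0) / (2.0 * np.sqrt(W))              # renormalization (3.7)
emp = np.mean(y[:, 0] * y[:, 1])                  # empirical cov(Y_W(t), Y_W(0))
print(emp, Rt**2 / (2.0 * W))                     # both near 7.37 here
```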

Theorem 3.1 Consider the renormalized and centered process $Y_W(t)$ defined in (3.7) above. Then:

$$\lim_{W\to\infty} Y_W(t) = Y(t) \ \text{in } L_2(\mathbb{P}) \times L_2 \quad (3.8)$$

Moreover, $Y(t)$ is Gaussian white noise.

We prove the result through the following two results.

Proposition 3.1 Let $R_W^Y(t)$ denote the covariance of $Y_W(t)$. Then for every $f(\cdot) \in L_2$:

$$\lim_{W\to\infty} \int_{-\infty}^{\infty} R_W^Y(t-s)\, f(s)\,ds = f(t), \quad -\infty < t < \infty \quad (3.9)$$

or formally:

$$\lim_{W\to\infty} R_W^Y(t) = \delta(t), \quad -\infty < t < \infty$$

where $\delta(\cdot)$ is the Dirac delta function.

Proof:
First note that $Y_W(t)$ is a w.s.s. process whose covariance is given by

$$R_W^Y(t) = \frac{1}{2W}\, R_W^2(t) = \frac{1}{2W}\left(\frac{\sin 2\pi W t}{\pi t}\right)^2$$

Next note that by Parseval's theorem:

$$\frac{1}{2W}\int_{-\infty}^{\infty} \left(\frac{\sin 2\pi W t}{\pi t}\right)^2 dt = \frac{1}{2W}\int_{-\infty}^{\infty} 1\mathrm{I}_{[-W,W]}^2(\lambda)\,d\lambda = 1$$

Define the measure $d\phi_W(t) = \frac{1}{2W}\left(\frac{\sin 2\pi W t}{\pi t}\right)^2 dt$, so that $\phi_W(-\infty,\infty) = 1 = \int_{-\infty}^{\infty} d\phi_W(t)$. Hence for any continuous $f(\cdot) \in L_2$ we have:

$$\int_{-\infty}^{\infty}\left|\int_{-\infty}^{\infty} f(t+s)\,d\phi_W(s) - f(t)\right|^2 dt = \int_{-\infty}^{\infty}\left|\int_{-\infty}^{\infty} \big(f(t+s)-f(t)\big)\,d\phi_W(s)\right|^2 dt \le \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} |f(t+s)-f(t)|^2\,d\phi_W(s)\,dt$$

where we have used Minkowski's inequality for integrals [15] in the second step, noting that $d\phi_W(\cdot)$ defines a measure that puts mass 1 on $(-\infty,\infty)$. Now, performing the change of variables $2\pi W s = \tau$, we can rewrite:

$$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} |f(t+s)-f(t)|^2\,d\phi_W(s)\,dt = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \left|f\!\left(t+\tfrac{\tau}{2\pi W}\right)-f(t)\right|^2 \frac{1}{\pi}\left(\frac{\sin\tau}{\tau}\right)^2 d\tau\,dt = \frac{1}{\pi}\left\| g_W(t,\tau)\,\frac{\sin\tau}{\tau} \right\|^2_{L_2\times L_2}$$

where $g_W(t,\tau) = f(t+\frac{\tau}{2\pi W}) - f(t)$ and $L_2\times L_2 = L_2(-\infty,\infty)\times L_2(-\infty,\infty)$ with the product measure defined thereon.

Now, for every fixed $\tau$,

$$\int_{-\infty}^{\infty} |g_W(t,\tau)|^2\,dt \le 2\|f\|^2$$

and hence, using Fubini's theorem and the fact that $\int_{-\infty}^{\infty} \left(\frac{\sin\tau}{\tau}\right)^2 d\tau = \pi$:

$$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \left|g_W(t,\tau)\,\frac{\sin\tau}{\tau}\right|^2 dt\,d\tau \le 2\pi\|f\|^2$$

Moreover, since $|g_W(t,\tau)| \to 0$ as $W\to\infty$ for every fixed $\tau$, we have that

$$\int_{-\infty}^{\infty}\left|\int_{-\infty}^{\infty} f(t+s)\,d\phi_W(s) - f(t)\right|^2 dt \to 0 \ \text{as } W\to\infty$$

by dominated convergence.
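The Fejér-type measure $d\phi_W$ used in the proof is a bona fide probability measure that concentrates at the origin as $W$ grows; both facts are easy to confirm numerically (the grid and truncation limits below are arbitrary choices of ours):

```python
import numpy as np

def phi_density(t, W):
    """Density of d phi_W(t) = (1/(2W)) * (sin(2*pi*W*t)/(pi*t))^2 dt."""
    r = np.where(np.abs(t) < 1e-12, 2.0 * W,
                 np.sin(2.0 * np.pi * W * t) / (np.pi * t))
    return r * r / (2.0 * W)

t = np.linspace(-50.0, 50.0, 2_000_001)
dt = t[1] - t[0]
for W in (1.0, 8.0):
    p = phi_density(t, W)
    mass = p.sum() * dt                      # total mass, equal to 1 by Parseval
    near = p[np.abs(t) < 1.0].sum() * dt     # mass within |t| < 1
    print(W, mass, near)                     # mass ~ 1; `near` -> 1 as W grows
```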

The second result we prove is the convergence to a Gaussian process. For this we need the following
result.

Lemma 3.1 Let $\{X_W(t), -\infty < t < \infty\}$ be a zero-mean stationary Gaussian process with band-limited spectral density. Let $(a,b)$ and $(c,d)$ be any non-overlapping intervals in $\Re$. Let $h(\cdot) \in L_2$. Define the random variables $X^W_{a,b}$ (resp. $X^W_{c,d}$) as $X^W_{a,b} = \int_a^b X_W(s)h(s)\,ds$ (similarly for $X^W_{c,d}$).

Then $(X^W_{a,b}, X^W_{c,d})$ are asymptotically independent as $W\to\infty$.

Proof: To show the result it suffices to show that the random variables are asymptotically uncorrelated, since they are jointly Gaussian by construction.

$$\mathbb{E}[X^W_{a,b}\, X^W_{c,d}] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} 1\mathrm{I}_{(a,b)}(u)\, 1\mathrm{I}_{(c,d)}(v)\, h(u)h(v)\, R_W(u-v)\,du\,dv$$

Hence, applying the result of Proposition 3.1, we have:

$$\lim_{W\to\infty} \mathbb{E}[X^W_{a,b}\, X^W_{c,d}] = \int_{-\infty}^{\infty} 1\mathrm{I}_{(a,b)}(u)\, 1\mathrm{I}_{(c,d)}(u)\, h^2(u)\,du = 0$$

since $(a,b)$ and $(c,d)$ are non-overlapping.
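The decorrelation in Lemma 3.1 can be seen directly from the double integral for $\mathbb{E}[X^W_{a,b}X^W_{c,d}]$: evaluating it numerically for disjoint intervals shows the covariance vanishing as $W$ grows. A sketch (the intervals, the weight $h \equiv 1$, and the grid resolution are our arbitrary choices):

```python
import numpy as np

def R_W(t, W):
    """Band-limited covariance kernel R_W(t) = sin(2*pi*W*t)/(pi*t)."""
    return np.where(np.abs(t) < 1e-12, 2.0 * W,
                    np.sin(2.0 * np.pi * W * t) / (np.pi * t))

def interval_cov(a, b, c, d, W, n=1500):
    """Midpoint-rule value of the double integral of R_W(u - v)
    over (a,b) x (c,d), i.e. E[X^W_{a,b} X^W_{c,d}] with h = 1."""
    u = a + (np.arange(n) + 0.5) * (b - a) / n
    v = c + (np.arange(n) + 0.5) * (d - c) / n
    du, dv = (b - a) / n, (d - c) / n
    return R_W(u[:, None] - v[None, :], W).sum() * du * dv

for W in (1.0, 10.0, 100.0):
    print(W, interval_cov(0.0, 1.0, 1.5, 2.5, W))   # -> 0 as W -> infinity
```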

Remark 3.1 From the above it readily follows that the random variables $Y^W_{a,b}$ and $Y^W_{c,d}$, defined in an analogous way, are also asymptotically independent as $W\to\infty$, since they are functionals of the underlying process $X_W$ on non-overlapping intervals.

Let $C_W(h)$ denote the characteristic functional of $\{Y^W_t, -\infty < t < \infty\}$, defined as:

$$C_W(h) = \mathbb{E}[e^{i[Y^W,h]_T}], \quad h \in L_2(0,T) \quad (3.10)$$

where $[x,y]_t = \int_0^t x(s)y(s)\,ds$ for $x,y \in L_2[0,T]$, $t \le T$.

Proposition 3.2

$$\lim_{W\to\infty} C_W(h) = e^{-\frac{1}{2}\|h\|_T^2} \quad (3.11)$$

i.e., $\{Y^W_t\}$ converges to Gaussian white noise in $L_2(\mathbb{P}) \times L_2$.

Proof:
First note that from the asymptotic independence we have:

$$\lim_{W\to\infty} \mathbb{E}[e^{i[Y^W,h]_{t+s}}] = \lim_{W\to\infty} \mathbb{E}[e^{i[Y^W,h]_t}]\,\mathbb{E}[e^{i[Y^W,h]_t^{t+s}}]$$

where $[x,y]_a^b = \int_a^b x(s)y(s)\,ds$.

Now, to show that $\{Y^W_t\}$ converges to Gaussian white noise it is sufficient to show that:

$$\lim_{W\to\infty} \frac{\frac{d}{dt}\mathbb{E}[e^{i[Y^W,h]_t}]}{\mathbb{E}[e^{i[Y^W,h]_t}]} = -\frac{1}{2}h^2(t) \quad \text{a.e. in } t \quad (3.12)$$

Note that from the boundedness of $e^{i[Y^W,h]_t}$ and the convergence in q.m. of $Y^W_\cdot$ it is easy to show that:

$$\lim_{W\to\infty} \frac{d}{dt}\mathbb{E}[e^{i[Y^W,h]_t}] = \frac{d}{dt}\lim_{W\to\infty} \mathbb{E}[e^{i[Y^W,h]_t}]$$

Now:

$$\frac{d}{dt}\mathbb{E}[e^{i[Y^W,h]_t}] = \lim_{\Delta\to 0}\frac{1}{\Delta}\,\mathbb{E}\!\left[e^{i[Y^W,h]_{t+\Delta}} - e^{i[Y^W,h]_t}\right] \quad (3.13)$$

where the limit on the r.h.s. of (3.13) is to be interpreted in $L_2(\mathbb{P})$. Hence:

$$\frac{1}{\Delta}\,\mathbb{E}\!\left[e^{i[Y^W,h]_{t+\Delta}} - e^{i[Y^W,h]_t}\right] = \frac{1}{\Delta}\,\mathbb{E}\!\left[e^{i[Y^W,h]_t}\left(e^{i\xi^W(\Delta)} - 1\right)\right]$$

where $\xi^W(\Delta) = [Y^W,h]_t^{t+\Delta}$.

Now we make use of the following identity:

$$e^{ix} = 1 + ix - \frac{1}{2}x^2 - \frac{i}{2}\int_0^x s^2\, e^{i(x-s)}\,ds$$

For the first term we obtain, for each $\Delta > 0$, by asymptotic independence:

$$\lim_{W\to\infty} \mathbb{E}[e^{i[Y^W,h]_t}\,\xi^W(\Delta)] = \lim_{W\to\infty} \mathbb{E}[e^{i[Y^W,h]_t}]\,\mathbb{E}[\xi^W(\Delta)] = 0$$

For the second term we have:

$$\lim_{W\to\infty} \mathbb{E}[e^{i[Y^W,h]_t}\,\xi^W(\Delta)^2] = \lim_{W\to\infty} \mathbb{E}[e^{i[Y^W,h]_t}]\,\mathbb{E}[\xi^W(\Delta)^2]$$

and from Proposition 3.1 we have:

$$\lim_{W\to\infty} \mathbb{E}[\xi^W(\Delta)^2] = \int_t^{t+\Delta} h^2(u)\,du$$

For the third term we obtain:

$$\left|\mathbb{E}\!\left[e^{i[Y^W,h]_t}\int_0^{\xi^W(\Delta)} s^2\, e^{i(\xi^W(\Delta)-s)}\,ds\right]\right| \le \mathbb{E}|\xi^W(\Delta)|^3 \le C\left(\mathbb{E}[|\xi^W(\Delta)|^2]\right)^{3/2}$$

Therefore, once again using the result of Proposition 3.1, we obtain:

$$\lim_{W\to\infty} \left(\mathbb{E}[|\xi^W(\Delta)|^2]\right)^{3/2} = \left(\int_t^{t+\Delta} h^2(u)\,du\right)^{3/2} = O(\Delta^{3/2})$$

Therefore, combining the three estimates above, we obtain:

$$\lim_{\Delta\to 0}\lim_{W\to\infty} \frac{\frac{1}{\Delta}\,\mathbb{E}\!\left[e^{i[Y^W,h]_{t+\Delta}} - e^{i[Y^W,h]_t}\right]}{\mathbb{E}[e^{i[Y^W,h]_t}]} = \lim_{\Delta\to 0}\frac{1}{\Delta}\left(-\frac{1}{2}\int_t^{t+\Delta} h^2(u)\,du + O(\Delta^{3/2})\right) = -\frac{1}{2}h^2(t) \quad \text{a.e. } t$$

Therefore we obtain:

$$\lim_{W\to\infty} \mathbb{E}[e^{i[Y^W,h]_T}] = e^{-\frac{1}{2}\|h\|_T^2}$$

i.e., the limiting process has a characteristic functional that corresponds to the standard Gauss measure on $L_2[0,T]$ for every $T$, and hence is Gaussian white noise.

This completes the proof.
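Proposition 3.2 can be illustrated by Monte Carlo: synthesize band-limited paths, form $Y_W$ via (3.7), and compare the empirical characteristic functional against $e^{-\|h\|_T^2/2}$. This is a rough finite-$W$, finite-sample sketch; the discretization, the value of $W$, the choice $h \equiv 1$, and the path count are arbitrary choices of ours:

```python
import numpy as np

rng = np.random.default_rng(3)

def bandlimited_paths(t, W, n_paths, df=0.5, rng=rng):
    """Flat-spectrum Gaussian paths with Var X_W(t) = 2W (spectral synthesis)."""
    f = np.arange(df / 2.0, W, df)
    a = rng.normal(size=(f.size, n_paths))
    b = rng.normal(size=(f.size, n_paths))
    arg = 2.0 * np.pi * np.outer(t, f)
    return np.sqrt(2.0 * df) * (np.cos(arg) @ a + np.sin(arg) @ b)

T, W = 1.0, 40.0
t = np.linspace(0.0, T, 2001)
dt = t[1] - t[0]
X = bandlimited_paths(t, W, n_paths=2000)
Y = (X**2 - 2.0 * W) / (2.0 * np.sqrt(W))     # renormalized square, eq. (3.7)

h = np.ones_like(t)                            # h = 1 on [0, T], so ||h||_T^2 = 1
z = (h[:, None] * Y).sum(axis=0) * dt          # [Y_W, h]_T for each path
emp = np.exp(1j * z).mean()                    # empirical C_W(h)
print(emp, np.exp(-0.5))                       # (3.11): tends to e^{-1/2}
```

For finite $W$ each $[Y_W, h]_T$ is a sum of roughly $2WT$ weakly dependent chi-square-type terms, so the Gaussian characteristic functional is only approached as $W$ grows; with $W = 40$ the agreement is already close.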

A consequence of the above result is that any integral power of white noise should remain white noise under proper renormalization. Indeed this is the case, and we show it below.

Now let $a_W(t) = \frac{R_W(t)}{R_W(0)}$, so that for any integer $n$, using (3.5), we obtain:

$$X_W^n(t) = \big(a_W(t)X_W(0) + \nu_W(t)\big)^n = \sum_{p=0}^{n} \binom{n}{p}\, a_W^p(t)\, X_W^p(0)\, \nu_W^{n-p}(t) \quad (3.14)$$
From the independence of $X(0)$ and $\nu_W(t)$ we have:

$$R_W^n(t) = \mathrm{cov}\big(X_W^n(t), X_W^n(0)\big) = \sum_{p=0}^{n} \binom{n}{p}\, a_W^p(t)\,\mathbb{E}[X_W^{n+p}(0)]\,\mathbb{E}[\nu_W^{n-p}(t)] - \big(\mathbb{E}[X_W^n(0)]\big)^2$$
$$= a_W^n(t)\,\mathbb{E}[X_W^{2n}(0)] + \sum_{p=0}^{n-1} \binom{n}{p}\, a_W^p(t)\,\mathbb{E}[X_W^{n+p}(0)]\,\mathbb{E}[\nu_W^{n-p}(t)] - \big(\mathbb{E}[X_W^n(0)]\big)^2 \quad (3.15)$$

Define:

$$Y_W^n(t) = \frac{X_W^n(t) - \mathbb{E}[X_W^n(0)]}{\sqrt{(n-1)!!\,(2W)^n}}, \quad n \ge 2 \quad (3.16)$$

where $(n-1)!! = (n-1)(n-3)(n-5)\cdots 1$.
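The normalization in (3.16) is built from the even Gaussian central moments $\mathbb{E}[(X-m)^n] = (n-1)!!\,\sigma^n$; this identity is easy to verify numerically (the sample size and parameters below are arbitrary choices of ours):

```python
import numpy as np

def double_factorial(k):
    """k!! = k (k-2) (k-4) ... ; by convention 0!! = (-1)!! = 1."""
    out = 1
    while k > 1:
        out *= k
        k -= 2
    return out

rng = np.random.default_rng(2)
m, sigma = 1.5, 2.0
x = rng.normal(m, sigma, size=2_000_000)

for n in (2, 3, 4, 5, 6):
    mc = np.mean((x - m) ** n)
    exact = double_factorial(n - 1) * sigma**n if n % 2 == 0 else 0.0
    print(n, round(mc, 2), exact)   # exact even central moments: 4, 48, 960
```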

Then we can state the following theorem about higher order powers of white noise.

Theorem 3.2 Let $\{X_W(t), -\infty < t < \infty\}$ be a Gaussian process whose spectral density is flat, of unit power, and with support in $[-W,W]$. Then the process $\{Y_W^n(t), -\infty < t < \infty\}$ converges to Gaussian white noise in $L_2(\mathbb{P}) \times L_2$ as $W\to\infty$.

Proof: The proof essentially follows the arguments in the proof of Theorem 3.1. Indeed, the result follows from the fact that for any Gaussian $N(m,\sigma^2)$ r.v. the moment of order $n$ is given by:

$$\mathbb{E}[(X-m)^n] = \begin{cases} (n-1)!!\,\sigma^n, & n \text{ even} \\ 0, & n \text{ odd} \end{cases}$$

where $(n-1)!! = (n-1)(n-3)(n-5)\cdots 1$.

Note that the covariance of $Y_W^n$ is just given by $\frac{R_W^n(t)}{(n-1)!!\,(2W)^n}$. Now from (3.15) it can be seen that $R_W^n(t) = C\, R_W(t)^n$, where $C$ is a constant.

Let $f(\cdot) \in L_2$ and consider:

$$\int_{-\infty}^{\infty} \frac{R_W^n(t-s)}{(n-1)!!\,(2W)^n}\, f(s)\,ds = \frac{1}{(n-1)!!\,(2W)^n}\int_{-\infty}^{\infty} \left(\frac{\sin 2\pi W(t-s)}{\pi(t-s)}\right)^n f(s)\,ds$$

Noting that $|R_W(t)| \le R_W(0)$, we have:

$$\left|\frac{1}{(n-1)!!\,(2W)^n}\int_{-\infty}^{\infty} \left(\frac{\sin 2\pi W(t-s)}{\pi(t-s)}\right)^n f(s)\,ds\right| \le C_1 \|f\| < \infty$$

Furthermore (see [12]):

$$\frac{1}{(n-1)!!\,(2W)^n}\int_{-\infty}^{\infty} \left(\frac{\sin 2\pi W(t-s)}{\pi(t-s)}\right)^n dt = C_3(n) < \infty$$

Then we can repeat the arguments of Propositions 3.1 and 3.2, mutatis mutandis, to show that the normalized process is Gaussian white noise whose variance depends on the constants.

Remark 3.2 This result can be directly extended to homogeneous polynomials $P(X_W(t))$ in the obvious way, by defining the renormalization factor so as to normalize the highest power of the polynomial. It can be shown that the lower-order powers do not play a role in the asymptotic limit.

Acknowledgement
The research of RM has been supported in part by a grant from the Natural Sciences and Engineering
Research Council of Canada (NSERC) through the Discovery Grant program. He would also like to
thank Patrick Mitran for useful discussions.

References

[1] L. Accardi, U. Franz, and M. Skeide. Renormalized squares of white noise and other non-Gaussian noises as Lévy processes on real Lie algebras. Comm. Math. Phys., 228(1):123–150, 2002.

[2] A. Bagchi and R. R. Mazumdar. On Radon-Nikodým derivatives of finitely-additive measures induced by nonlinear transformations on Hilbert space. Nonlinear Anal., 21(12):879–902, 1993.

[3] A. Bagchi and R. R. Mazumdar. Some recent results in finitely additive white noise theory. Acta Appl. Math., 35(1-2):27–47, 1994.

[4] A. V. Balakrishnan. On the approximation of Itô integrals using band-limited processes. SIAM J. Control, 12:237–251, 1974; errata, ibid., 13:975, 1975.

[5] A. V. Balakrishnan. Radon-Nikodym derivatives of a class of weak distributions on Hilbert spaces. Appl. Math. Optim., 3(2-3):209–225, 1976/77.

[6] A. V. Balakrishnan. Nonlinear white noise theory. In Multivariate Analysis, V (Proc. Fifth Internat. Sympos., Univ. Pittsburgh, Pittsburgh, Pa., 1978), pages 97–109. North-Holland, Amsterdam, 1980.

[7] A. V. Balakrishnan. Applied Functional Analysis, volume 3 of Applications of Mathematics. Springer-Verlag, New York, second edition, 1981.

[8] P. d'Alessandro, A. Germani, and M. Piccioni. Relationships between measures induced by Itô and white noise linear equations. Math. Comput. Simulation, 26(4):368–372, 1984.

[9] A. Gandolfi and A. Germani. On the definition of a topology in Hilbert spaces with applications to the white noise theory. J. Franklin Inst., 316(6):435–444, 1983.

[10] I. M. Gelfand and N. Ya. Vilenkin. Generalized Functions: Applications of Harmonic Analysis. Fizmatgiz, 1961.

[11] I. I. Gikhman and A. V. Skorokhod. The Theory of Stochastic Processes I. Springer-Verlag, 2004.

[12] I. S. Gradshteyn and I. M. Ryzhik. Tables of Integrals, Series, and Products. Academic Press, 1980.

[13] L. Gross. Measurable functions on Hilbert space. Trans. Amer. Math. Soc., 105, 1962.

[14] L. Gross. Harmonic analysis on Hilbert space. Mem. Amer. Math. Soc., No. 46, 1963.

[15] G. H. Hardy, J. E. Littlewood, and G. Pólya. Inequalities. Cambridge University Press, 1934.

[16] T. Hida and H. Nomoto. Finite dimensional approximation to band limited white noise. Nagoya Math. J., 29:211–216, 1967.

[17] G. Kallianpur and R. L. Karandikar. White Noise Theory of Prediction, Filtering and Smoothing, volume 3 of Stochastics Monographs. Gordon & Breach Science Publishers, New York, 1988.

[18] H. H. Kuo. Gaussian Measures in Banach Spaces. Lecture Notes in Mathematics, Vol. 463. Springer-Verlag, Berlin, 1975.

[19] M. San Miguel and J. M. Sancho. Theory of nonlinear Gaussian noise. Z. Phys. B: Condensed Matter, 43:361–372, 1981.

[20] V. V. Sazonov. A remark on characteristic functionals. Theory Probab. Appl., 3:188–192, 1958.

[21] O. G. Smolyanov and A. V. Uglanov. Every Hilbert subspace of a Wiener space has measure zero. Matem. Zametki, 14(3):369–374, 1973.

[22] E. Wong and B. Hajek. Stochastic Processes in Engineering Systems. Springer-Verlag, 1983.
