Week 6

The document discusses probability and statistics concepts including random variables, probability distributions, expectations, moments, and examples of probability distributions like the uniform and Gaussian distributions. It provides definitions and properties of key concepts like the cumulative distribution function, probability density function, mean, variance, and skewness.

PROBABILITY AND STATISTICS

Turbulence is full of uncertainty. The statistical equations derived to


date do not yet (even if solved) tell us:

1. How likely it is that the local wind speed exceeds some specified value.
2. The connection between velocity fluctuations at different locations
in space, and/or at different instants in time.
3. The behavior of other fluctuating variables: pressure, temperature, vorticity, etc.

Probability: theoretical study of chance and uncertainty

Statistics: estimation based on finite data and models


For a discussion on the “Random Nature of Turbulence”, see Sec. 3.1
of Pope. Some basic concepts first, separate from fluid mechanics.

Discrete and continuous random variables

Event: something that is unpredictable, or random


Sample space: set of all possible outcomes for an event
Random variable: a real-valued function of sample space, that maps
all possible events into the real line.
Use X as generic name of a random variable; x for sample value.

Ex. Throw of a “fair” die. Possible outcomes are {1, 2, 3, 4, 5, 6}.

P[X = x] = 1/6 if x ∈ {1, 2, 3, 4, 5, 6}; P[X > 6] = 0 (impossibility),
P[X ≤ 6] = 1 (certainty). This r.v. X is discrete: it is restricted to
a finite number of possible outcomes.
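As a quick illustration (example code, not part of the original notes), the die probabilities above can be estimated by simulation:

```python
# A quick sketch (example code, not from the notes): estimating
# P[X = x] for a fair die by simulation.
import random
from collections import Counter

random.seed(0)
n = 100_000
counts = Counter(random.randint(1, 6) for _ in range(n))

for x in range(1, 7):
    print(f"P[X = {x}] ~ {counts[x] / n:.3f}")  # each should be near 1/6 ~ 0.167
print(sum(counts[x] for x in range(1, 7)) / n)  # 1.0: P[X <= 6] = 1 (certainty)
```

The empirical frequencies converge to 1/6 as n grows, and the total mass on {1, ..., 6} is exactly 1, matching the certainty statement above.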
In turbulence, most flow variables are continuous and differentiable.
They thus behave as “continuous” r.v.’s; their ranges may or may
not be restricted.

Characteristics of a single random variable


(Cumulative) Distribution Function (CDF)

The definition (for a random variable X)


F(x) = P(X < x), −∞ < x < ∞

(in Pope’s book, it is F(V) = P[U < V]).

Properties, in addition to 0 ≤ F (x) ≤ 1 for all x:

1. F(−∞) = 0: since X < −∞ is an impossible event


2. F (∞) = 1: since X is assumed to be finite.
3. F (x) is a monotonic non-decreasing function of x. That is:
a < b ⇒ F (a) ≤ F (b)
(since, if a < b, {X < a} is a subset of {X < b})
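The three properties can be seen numerically with an empirical CDF (example code, not from the notes; the Gaussian samples are my choice):

```python
# A quick sketch (example code, not from the notes): an empirical CDF
# F(x) = P(X < x) built from Gaussian samples, illustrating the three
# properties above.
import bisect
import random

random.seed(1)
samples = sorted(random.gauss(0.0, 1.0) for _ in range(10_000))

def ecdf(x):
    """Fraction of samples strictly below x, i.e. an estimate of P(X < x)."""
    return bisect.bisect_left(samples, x) / len(samples)

print(ecdf(-10.0))  # ~ 0: proxy for F(-inf) = 0
print(ecdf(10.0))   # ~ 1: proxy for F(inf) = 1
grid = [-3.0 + 0.1 * i for i in range(61)]
vals = [ecdf(x) for x in grid]
print(all(u <= v for u, v in zip(vals, vals[1:])))  # True: monotone non-decreasing
```

Monotonicity holds by construction here, since the count of samples below x can only grow as x increases.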

Continuous Random Variables and the PDF

For continuous random variables the CDF is differentiable


f_X(x) = dF_X/dx = lim_{∆x→0} [F(x + ∆x) − F(x)]/∆x,

which explains the name “probability density function (PDF)”. The
PDF has the following general properties: (i) f_X(−∞) = 0; (ii)
f_X(∞) = 0; (iii) f_X(x) ≥ 0 for any x; and (iv) ∫_{−∞}^{∞} f_X(x) dx = 1.
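The derivative relation can be checked numerically (example code; the exponential distribution is my choice, not from the notes):

```python
# A numerical sketch (example code): the finite-difference derivative of a
# CDF recovers the PDF. The exponential distribution is my choice, with
# F(x) = 1 - exp(-x) and f(x) = exp(-x) for x >= 0.
import math

def F(x):
    return 1.0 - math.exp(-x)

def f(x):
    return math.exp(-x)

dx = 1e-6
for x in (0.5, 1.0, 2.0):
    f_numeric = (F(x + dx) - F(x)) / dx
    print(abs(f_numeric - f(x)) < 1e-4)  # True: dF/dx ~ f at each x
```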
The probability of the event {a ≤ X < b} is given by the integral
∫_a^b f(x) dx. If we let a → b, such that the interval [a, b) closes up
and becomes just a point, we find the integral approaches zero. As
a result, we obtain P(X = x) = 0 for any x; that is, the probability of
a continuous random variable taking on any particular value is always
zero. The events {a ≤ X < b}, {a ≤ X ≤ b}, {a < X < b},
{a < X ≤ b} are all equivalent, and P[X < x] = P[X ≤ x].
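Both facts can be seen with a simple quadrature (example code; the exponential PDF f(x) = exp(−x) is my choice of example):

```python
# A numerical sketch (example code; the exponential PDF f(x) = exp(-x) is my
# choice): P(a <= X < b) as an integral of the PDF, and the shrinking-interval
# limit that gives P(X = x) = 0.
import math

def prob(a, b, n=10_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(math.exp(-(a + (i + 0.5) * h)) * h for i in range(n))

print(abs(prob(0.0, 1.0) - (1.0 - math.exp(-1.0))) < 1e-4)  # True (matches the exact CDF)
for eps in (0.1, 0.01, 0.001):
    print(round(prob(1.0, 1.0 + eps), 5))  # shrinks toward 0 as [a, b) closes up
```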

Expectations and Moments

The words “expectation”, “average” and “mean” are synonymous.


Formally, we write E[...]. For example, E[X] is the mean (often
denoted by µX ); E[(X −µX )2] is the variance. For a discrete random
variable with a finite number of possible outcomes
E[X] = Σ_k x_k P[X = x_k]

where P [X = xk ] is the probability of the kth possible outcome. For


a continuous random variable it is
E[X] = ∫_{−∞}^{∞} x f_X(x) dx.

This can be generalized to any (“deterministic”) function of X: if X


is random, so is q(X) for a given functional form
E[q(X)] = ∫_{−∞}^{∞} q(x) f_X(x) dx.

If q(X) = X^m the result is called the mth-order moment of X. Usually,
the improper integrals above converge, although a few pathological
forms are known.
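A short sketch of these moment integrals (example code; the exponential PDF and the function and parameter names are my own choices):

```python
# A numerical sketch (example code): E[q(X)] = integral of q(x) f(x) dx by a
# midpoint rule, here for q(X) = X^m and the exponential PDF f(x) = exp(-x)
# on x >= 0 (my choice of example), whose exact moments are E[X^m] = m!.
import math

def moment(m, lo=0.0, hi=60.0, n=200_000):
    """Midpoint-rule approximation of the mth moment of Exp(1)."""
    h = (hi - lo) / n
    return sum((x := lo + (i + 0.5) * h) ** m * math.exp(-x) * h for i in range(n))

for m in (1, 2, 3):
    print(abs(moment(m) - math.factorial(m)) < 1e-3)  # True: E[X^m] = m!
```

The upper limit 60 truncates the improper integral; for this rapidly decaying PDF the neglected tail is negligible.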

We are often interested in departures from the mean.
E[(X − µ_X)^m] = ∫_{−∞}^{∞} (x − µ_X)^m f_X(x) dx

is called the mth-order central moment; E[X^m] is called an absolute
moment. Clearly, E[(X − µ_X)^1] = E[X] − µ_X = 0, while
E[(X − µ_X)^2] = σ^2 gives the variance. If m is a positive integer,
a general formula relating central moments to absolute moments is

E[(X − µ_X)^m] = Σ_{r=0}^{m} mC_r E[X^r] (−µ_X)^{m−r}

where the binomial coefficient is

mC_r ≡ m! / [(r!)(m − r)!]    [r! = (r)(r − 1)(r − 2)···(2)(1)]
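The central-moment/absolute-moment relation can be verified directly (example code; the fair-die distribution is reused from the earlier example):

```python
# A quick check (example code) of the central-moment formula above, for
# m = 3 and the fair-die distribution P[X = k] = 1/6, k = 1..6.
from math import comb

pk = {k: 1 / 6 for k in range(1, 7)}
mu = sum(k * p for k, p in pk.items())  # mean = 3.5

def abs_moment(r):
    """Absolute moment E[X^r] of the die."""
    return sum(k ** r * p for k, p in pk.items())

m = 3
direct = sum((k - mu) ** m * p for k, p in pk.items())
via_formula = sum(comb(m, r) * abs_moment(r) * (-mu) ** (m - r) for r in range(m + 1))
print(abs(direct - via_formula) < 1e-12)  # True: the two computations agree
```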

EXAMPLES OF PROBABILITY DISTRIBUTIONS

We discuss only a few continuous r.v.’s; more can be found in probability texts.


1. Uniform Distribution

Let the PDF f(x) be constant within some interval [a, b]. To satisfy
∫_{−∞}^{∞} f(x) dx = 1, we must have

f(x) = 1/(b − a) if a ≤ x < b;  f(x) = 0 otherwise.
It is a simple matter to verify:

E[X] = ∫_a^b x/(b − a) dx = (a + b)/2

E[X^2] = ∫_a^b x^2/(b − a) dx = (a^2 + ab + b^2)/3

We also find that E[(X − µ)^2] = (b − a)^2/12. An example of such a
r.v. may be an angle, with random orientation within [0, 2π].
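These results can be confirmed by simulation (example code, using the random-angle example suggested above):

```python
# A simulation sketch (example code) of the uniform results above, using
# a random angle on [0, 2*pi) as suggested in the notes.
import math
import random

random.seed(2)
a, b = 0.0, 2.0 * math.pi
xs = [random.uniform(a, b) for _ in range(200_000)]

mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
print(abs(mean - (a + b) / 2) < 0.02)       # True: E[X] = (a + b)/2
print(abs(var - (b - a) ** 2 / 12) < 0.05)  # True: variance = (b - a)^2/12
```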

2. Gaussian (or “Normal”) Distribution

PDF with a smooth profile, function of µ and σ alone:

f(x) = [1/(σ√(2π))] exp[−(1/2)((x − µ)/σ)^2]
This is the so-called “bell-shaped curve”. Very common in applications
where the r.v. is the sum of many independent increments.

With a change of variables z = (x − µ)/σ,

∫_{−∞}^{∞} f(x) dx = ∫_{−∞}^{∞} [1/(σ√(2π))] exp(−z^2/2) σ dz = 1,

where we have used the properties of the “error function”

erf(x) = (2/√π) ∫_0^x exp(−t^2) dt

[erf(−x) = −erf(x); erf(−∞) = −1; erf(∞) = 1]
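The normalization can be checked numerically against the error function (example code; `mass` and its parameters are my own names):

```python
# A numerical check (example code): for a standard normal, the probability
# mass in [-c, c] equals erf(c / sqrt(2)), which tends to 1 as c grows.
import math

def mass(c, n=100_000):
    """Midpoint-rule integral of the standard normal PDF over [-c, c]."""
    h = 2.0 * c / n
    return sum(math.exp(-0.5 * (-c + (i + 0.5) * h) ** 2) * h
               for i in range(n)) / math.sqrt(2.0 * math.pi)

for c in (1.0, 2.0, 6.0):
    print(abs(mass(c) - math.erf(c / math.sqrt(2.0))) < 1e-6)  # True
print(round(mass(6.0), 6))  # 1.0: essentially all the probability
```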

“Central Limit Theorem” (e.g., Sec. 6.5 of T&L, Pope p. 61): (roughly)
the sum of a large number of independent and identically distributed
(iid) random variables tends towards a Gaussian distribution
(regardless of the shape of the PDF of the base r.v.).

In homogeneous turbulence, velocity fluctuations are observed to be
approximately Gaussian distributed.
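A simulation sketch of the central limit theorem (example code; the uniform base distribution and all names are my choices):

```python
# A sketch of the CLT in action (example code, all choices mine): sums of
# n iid uniform variables, whose standardized skewness and flatness approach
# the Gaussian values 0 and 3.
import random

random.seed(3)

def shape(n_terms, n_samples=50_000):
    """Return (skewness, flatness) of sums of n_terms iid U(0,1) variables."""
    sums = [sum(random.random() for _ in range(n_terms)) for _ in range(n_samples)]
    mu = sum(sums) / n_samples
    var = sum((s - mu) ** 2 for s in sums) / n_samples
    m3 = sum((s - mu) ** 3 for s in sums) / n_samples
    m4 = sum((s - mu) ** 4 for s in sums) / n_samples
    return m3 / var ** 1.5, m4 / var ** 2

skew, flat = shape(30)
print(abs(skew) < 0.1)        # True: near the Gaussian value 0
print(abs(flat - 3.0) < 0.2)  # True: near the Gaussian value 3
```

The base uniform PDF looks nothing like a bell curve, yet the sum of 30 such variables is already close to Gaussian by these measures.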
Shape of the PDF

May be quantified by normalized 3rd and 4th order moments:

Skewness: µ3 = E[(X − µ)^3]/σ^3  (= 0 for a Gaussian)

Flatness: µ4 = E[(X − µ)^4]/σ^4  (= 3 for a Gaussian)

In general, µ3 is a measure of departure from symmetry around the
mean, while µ4 is a measure of the likelihood of large deviations
(scaled by σ) around the mean.
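These diagnostics can be illustrated on a distribution that is far from Gaussian (example code; the exponential distribution is my choice):

```python
# A sketch (example code): skewness and flatness as shape diagnostics. The
# exponential distribution (my choice) is strongly asymmetric with a long
# right tail: its exact skewness is 2 and flatness is 9, versus 0 and 3
# for a Gaussian.
import random

random.seed(4)
xs = [random.expovariate(1.0) for _ in range(400_000)]
n = len(xs)
mu = sum(xs) / n
var = sum((x - mu) ** 2 for x in xs) / n
skew = sum((x - mu) ** 3 for x in xs) / n / var ** 1.5
flat = sum((x - mu) ** 4 for x in xs) / n / var ** 2
print(1.7 < skew < 2.3)   # True: close to the exact value 2
print(8.0 < flat < 10.0)  # True: close to the exact value 9
```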

Higher-order moments
E[(X − µ)^n] = ∫_{−∞}^{∞} (x − µ)^n f(x) dx

The integrand: contributions from values of X far from the mean
(from the “tail” portion of the PDF) become more dominant at larger n.
