Uncertainty Notes
Karweit
B. What’s deterministic and what’s random depends on the degree to which you take into
account all the relevant parameters.
1/11/06 Uncertainty 1
Johns Hopkins University What is Engineering? M. Karweit
a) Pr(X = xi) = f(xi) ⇒ the probability that the variable X takes the value xi is a
function of the value xi.
[Figure: probability distribution function — f(x) plotted at the discrete values x1, …, xN]
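A discrete probability distribution function f(xi) can be sketched in a few lines; the fair six-sided die below is an assumed example, not one from the notes:

```python
# A minimal discrete probability distribution function f(x_i):
# a fair six-sided die (illustrative choice, not from the notes).
from fractions import Fraction

def f(x):
    """Pr(X = x) for a fair die: 1/6 for x in 1..6, else 0."""
    return Fraction(1, 6) if x in range(1, 7) else Fraction(0)

# The probabilities over all possible values must sum to 1.
total = sum(f(x) for x in range(1, 7))
print(total)  # 1
```

Exact `Fraction` arithmetic is used so the normalization check holds without floating-point error.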
Pr(X ≤ x′) = F(x′) = ∑_{i=1}^{j} f(xi), where xj is the largest discrete value of X less than or equal to x′.
[Figure: cumulative distribution function — F(x) stepping up from 0 to 1 over the discrete values x1, …, xk]
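Building F(x′) by summing the pmf over all values not exceeding x′ can be sketched as follows, again assuming a fair die for illustration:

```python
# Cumulative distribution F(x') = sum of f(x_i) for x_i <= x',
# using a fair-die pmf as the (assumed) example.
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # Pr(X = x)

def F(x_prime):
    """Pr(X <= x') for the discrete pmf above."""
    return sum(p for x, p in pmf.items() if x <= x_prime)

print(F(3))  # 1/2
print(F(6))  # 1
```

F is a step function: it jumps by f(xi) at each xi and reaches 1 at the largest value.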
[Figure: cumulative distribution function F(x) for a continuous random variable, rising smoothly to 1.0]
For a continuous random variable, the probability density function is f(x) = dF(x)/dx, and the total area under f(x) is 1.0.
[Figure: probability density function f(x), with area = 1.0 under the curve]
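The relation f(x) = dF(x)/dx and the unit area under f can be checked numerically; the uniform distribution on [0, 1] below is an assumed example:

```python
# For a continuous RV, f(x) = dF(x)/dx and the area under f is 1.
# Numerical check with Uniform(0, 1), where F(x) = x on [0, 1],
# so f(x) should be 1 everywhere inside the interval.

def F(x):                      # CDF of Uniform(0, 1)
    return min(max(x, 0.0), 1.0)

h = 1e-6
def f(x):                      # density via a central difference for dF/dx
    return (F(x + h) - F(x - h)) / (2 * h)

# Midpoint rule: the area under f over [0, 1] should come out ~1.
area = sum(f((i + 0.5) / 1000) / 1000 for i in range(1000))
print(round(f(0.5), 6), round(area, 6))
```

The numerical derivative recovers the density up to floating-point error, and the Riemann sum recovers the unit area.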
a) the rth moment about the origin: νr = ∑_{i=1}^{k} xi^r f(xi)
i) the 1st moment about the origin is the mean: μ = ν1 = ∑_{i=1}^{k} xi f(xi)
b) the rth moment about the mean: μr = ∑_{i=1}^{k} (xi − μ)^r f(xi)
i) the 2nd moment about the mean μ2 is the variance: σ² = μ2 = ∑_{i=1}^{k} (xi − μ)² f(xi)
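The moment formulas above translate directly into code; the fair-die pmf is an assumed example:

```python
# Moments of a discrete pmf: nu_r = sum x_i^r f(x_i) about the origin,
# mu_r = sum (x_i - mu)^r f(x_i) about the mean. Fair die (assumed example).
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def moment_about_origin(r):
    return sum(x**r * p for x, p in pmf.items())

mu = moment_about_origin(1)                     # mean = nu_1

def moment_about_mean(r):
    return sum((x - mu)**r * p for x, p in pmf.items())

variance = moment_about_mean(2)                 # sigma^2 = mu_2
print(mu, variance)  # 7/2 35/12
```

For a fair die the mean is 7/2 and the variance 35/12, both exact with `Fraction`.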
c) for many distribution functions, knowing all the moments of f(x) is equivalent to
knowing f(x) itself.
4. Important moments
d) the kurtosis, μ4/(σ²)²: a measure of “peakedness”
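Kurtosis uses the same central-moment machinery; a fair die is again the assumed example (a flat distribution, so the value comes out well below 3):

```python
# Kurtosis = mu_4 / (sigma^2)^2, computed for a fair-die pmf (assumed example).
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())
mu2 = sum((x - mu)**2 * p for x, p in pmf.items())  # variance sigma^2
mu4 = sum((x - mu)**4 * p for x, p in pmf.items())  # 4th central moment
kurtosis = mu4 / mu2**2
print(kurtosis)  # 303/175, about 1.73
```

A value below 3 (the Gaussian benchmark) reflects the die's flat, tail-free shape.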
F. Estimation of random variables (RVs)
1. Assumptions/procedures
a) There exists a stable underlying p.d.f. for the RV
b) Investigating the characteristics of the RV consists of obtaining sample outcomes and
making inferences about the underlying distribution.
b) The sample mean: X̄ = (1/N) ∑_{i=1}^{N} Xi, where N is the sample size.
d) X̄ and s² are only estimates of the mean μ and variance σ² of the underlying p.d.f. (X̄ and s² are estimates computed from the sample; μ and σ² are characteristics of the population from which the sample was taken.)
e) X̄ and s² are themselves random variables. As such, they have their own means and
variances (which can be calculated).
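That the sample mean is itself a random variable can be illustrated by simulation; the die rolls and the sample/replication counts below are assumed choices:

```python
# X-bar is itself a random variable: repeated sampling gives different
# sample means, and their average approaches mu. Die-roll sketch (assumed).
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def sample_mean(n):
    return sum(random.randint(1, 6) for _ in range(n)) / n

means = [sample_mean(25) for _ in range(2000)]  # 2000 independent sample means
grand_mean = sum(means) / len(means)
spread = max(means) - min(means)
print(round(grand_mean, 2), spread > 0)  # near mu = 3.5; the means vary
```

The 2000 values of X̄ scatter around μ = 3.5, and their own average sits very close to it.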
3. Expected value
a) The expected value of a random variable X is written E(X). It is the value that one
would obtain if a very large number of samples were averaged together.
b) As defined above:
i) E[X̄] = μ, i.e., the expected value of the sample mean is the population mean.
ii) E[s²] = σ², i.e., the expected value of the sample variance is the population variance.
c) Expectation allows us to use sample statistics to infer population statistics.
d) Properties of expectation:
i) E[aX + bY] = aE[X] + bE[Y], where a, b are constants
ii) If Z = g(X), then E[Z] = E[g(X)] = ∑ g(x) Pr(X = x), summed over all values x of X.
Example: Throw a die. If the die shows a “6” you win $5; otherwise, you lose $1. What’s the expected value E[Z] of this “game”?
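Working the example with the formula above gives E[Z] = 5·(1/6) + (−1)·(5/6) = 0, so the game is exactly fair. A sketch:

```python
# Expected value of the die game: win $5 on a "6", lose $1 otherwise.
# E[Z] = sum of g(x) Pr(X = x) over the six faces.
from fractions import Fraction

def payoff(face):
    return 5 if face == 6 else -1

expected = sum(payoff(x) * Fraction(1, 6) for x in range(1, 7))
print(expected)  # 0 -- the game is exactly fair
```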
iii) E[XY] = E[X] E[Y] provided X and Y are “independent”, i.e., samples of X cannot be used to predict anything about samples of Y (and vice versa).
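The product rule can be verified exactly for two independent fair dice (an assumed example):

```python
# Checking E[XY] = E[X] E[Y] exactly for two independent fair dice.
from fractions import Fraction
from itertools import product

p = Fraction(1, 36)  # by independence, each ordered pair (x, y) has Pr 1/36
E_X = sum(x * Fraction(1, 6) for x in range(1, 7))
E_Y = E_X
E_XY = sum(x * y * p for x, y in product(range(1, 7), repeat=2))
print(E_XY == E_X * E_Y)  # True
```

Both sides come out to 49/4; the equality fails in general when X and Y are dependent.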
i) repeated measurements
ii) different measurement strategy