
Johns Hopkins University    What is Engineering?    M. Karweit

UNCERTAINTY—THE DESCRIPTION OF RANDOM EVENTS

A. Some introductory definitions

1. Event/realization: the rolling of a pair of dice, the taking of a measurement, the performing of an experiment.
2. Outcome: the result of rolling the dice, taking the measurement, etc.
3. Deterministic event: an event whose outcome can be predicted realization after
realization, e.g., the measured length of a table to the nearest cm.
4. Random event/process/variable: an event/process that is not and cannot be made
exact and, consequently, whose outcome cannot be predicted, e.g., the sum of the
numbers on two rolled dice.
5. Probability: an estimate of the likelihood that a random event will produce a certain
outcome.

B. What’s deterministic and what’s random depends on the degree to which you take into
account all the relevant parameters.

1. Mostly deterministic—only a small fraction of an outcome cannot be accounted for


a. The measurement of the length of a table—temperature/humidity variation,
measurement resolution, instrument/observer error; and, at the quantum
level, intrinsic uncertainty.
b. The measurement of a volume of water—evaporation, meniscus, temperature
dependency.

2. Half and half—a significant fraction of an outcome cannot be accounted for


a. The measurement of tidal height in an ocean basin—measurement is
contaminated by effects from wind and boats.

3. Mostly random—most of an outcome cannot be accounted for


a. The sum of the numbers on two rolled dice.
b. The trajectory of a given molecule in a solution
c. The stock market
d. Fluctuations in water pressure in a municipal water supply

C. The “sample space”—the set of all possible outcomes

1. A rolled die: 1, 2, 3, 4, 5, 6 (discrete outcome)


2. A flipped coin: H, T (discrete outcome)
3. The measured width of a room: ? (continuous outcome)
4. The measured period of a pendulum: ? (continuous outcome)

1/11/06 Uncertainty 1

D. Statistical regularity—if a random event is repeated many times, it will produce a distribution of outcomes. This distribution will approach an asymptotic form as the number of events increases.

1. If the distribution is represented as the number of occurrences of each outcome, the distribution is called the frequency distribution function.

2. If the distribution is represented as the percentage of occurrences of each outcome, the distribution is called the probability distribution function.

3. We attribute probability to the asymptotic distribution of outcome occurrences.


a) A “fair” die will produce each of its six outcomes with equal likelihood;
therefore we say that the probability of getting a “1” is 1/6.
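Statistical regularity can be seen in a short simulation (a sketch in Python, not part of the original notes): as the number of rolls grows, each face's relative frequency approaches its probability of 1/6.

```python
# A simulation sketch (not from the notes): the frequency distribution of a
# fair die approaches its asymptotic form, giving each face probability 1/6.
import random
from collections import Counter

random.seed(0)                       # fixed seed for reproducibility
N = 60_000
counts = Counter(random.randint(1, 6) for _ in range(N))
for face in range(1, 7):
    # relative frequency of each face; all approach 1/6 ≈ 0.167
    print(face, round(counts[face] / N, 3))
```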

E. Description of a random variable X

1. Distribution of discrete outcomes

a) Pr(X = xi) = f(xi) ⇒ the probability that the variable X takes the value xi is a
function of the value xi.

b) f(xi) is called the probability distribution function

[Figure: the probability distribution function f(xi) plotted against x, over the outcomes x1 … xN]
c) Properties of discrete probabilities

i) Pr(X = xi) = f(xi) ≥ 0 for all i

ii) ∑_{i=1}^{k} Pr(X = xi) = ∑_{i=1}^{k} f(xi) = 1, for k possible discrete outcomes

iii) Pr(a < X ≤ b) = F(b) − F(a) = ∑_{a < xi ≤ b} f(xi)
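These properties can be checked exactly for a fair die; a small sketch (not from the notes) using Python's fractions module:

```python
# Checking properties i)-iii) for a single fair die, using exact rational
# arithmetic so the identities hold with no rounding error.
from fractions import Fraction

f = {x: Fraction(1, 6) for x in range(1, 7)}      # discrete p.d.f.

assert all(p >= 0 for p in f.values())            # i) f(xi) >= 0
assert sum(f.values()) == 1                       # ii) probabilities sum to 1

def F(x):                                         # cumulative distribution
    return sum(p for xi, p in f.items() if xi <= x)

# iii) Pr(2 < X <= 5) = F(5) - F(2) = f(3) + f(4) + f(5)
print(F(5) - F(2))  # 1/2
```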


d) Cumulative discrete probability distribution function

Pr(X ≤ x′) = F(x′) = ∑_{i=1}^{j} f(xi) , where xj is the largest discrete value of X less than or equal to x′. Pr(X ≤ xk) = 1.

[Figure: the cumulative distribution function F(xi), rising in steps from 0 at x1 to 1 at xk]

e) Examples of discrete probability distribution functions

i) Distribution function for throwing a die:

f(xi) = 1/6 for i = 1, …, 6

ii) Distribution function for the sum of two thrown dice


f(xi) = 1/36 for x1 = 2
        2/36 for x2 = 3
        3/36 for x3 = 4
        4/36 for x4 = 5
        5/36 for x5 = 6
        6/36 for x6 = 7
        5/36 for x7 = 8
        4/36 for x8 = 9
        3/36 for x9 = 10
        2/36 for x10 = 11
        1/36 for x11 = 12
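The tabulated values can be reproduced by enumerating the 36 equally likely ordered outcomes of the two dice (a sketch, not from the notes):

```python
# Enumerating all ordered pairs (a, b) of two dice reproduces the table
# above: the sum 7 occurs 6 ways out of 36, the sums 2 and 12 one way each.
from collections import Counter
from fractions import Fraction

counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
f = {s: Fraction(counts[s], 36) for s in range(2, 13)}

assert sum(f.values()) == 1            # probabilities sum to 1
print(f[7], f[2], f[12])  # 1/6 1/36 1/36
```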

2. Distribution of continuous outcomes

a) Cumulative distribution function

Pr(X ≤ x) = F(x) = ∫_{−∞}^{x} f(x′) dx′


[Figure: the continuous cumulative distribution function F(x), rising from 0 to 1.0]

b) Probability density (distribution) function (p.d.f.)

f(x) = dF(x)/dx

[Figure: the probability density function f(x); the area under the curve is 1.0]

c) Properties of F(x) and f(x)

i) F(-∞) = 0, 0 ≤ F(x) ≤ 1, F(∞) = 1


ii) Pr(a < X ≤ b) = F(b) − F(a) = ∫_a^b f(x) dx

d) Examples of continuous p.d.f.s

i) “top hat” or uniform distribution:


f(x) = 1/(2a) for |x| ≤ a
f(x) = 0 for |x| > a

ii) Gaussian distribution:


f(x) = [1/(√(2π) σ)] exp[−(x − μ)²/(2σ²)] , where μ and σ are given constants
iii) Poisson, Binomial are other important named distributions
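A numerical sanity check of the "area = 1" property for the two densities above (a sketch, not from the notes; the half-width a = 2 for the top hat is an arbitrary illustrative choice):

```python
# Riemann-sum check that the uniform and Gaussian densities integrate to ~1.
import math

def gaussian(x, mu=0.0, sigma=1.0):
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (math.sqrt(2 * math.pi) * sigma)

def top_hat(x, a=2.0):                 # 1/(2a) on |x| <= a, zero elsewhere
    return 1 / (2 * a) if abs(x) <= a else 0.0

dx = 0.001
xs = [-10 + i * dx for i in range(int(20 / dx))]
area_g = sum(gaussian(x) for x in xs) * dx    # sum over [-10, 10]
area_t = sum(top_hat(x) for x in xs) * dx
print(round(area_g, 3), round(area_t, 3))  # both ≈ 1.0
```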


3. Another way of representing a distribution function: moments

a) the r-th moment about the origin: ν_r = ∑_{i=1}^{k} xi^r f(xi)

   i) the 1st moment about the origin is the mean: μ = ν_1 = ∑_{i=1}^{k} xi f(xi)

b) the r-th moment about the mean: μ_r = ∑_{i=1}^{k} (xi − μ)^r f(xi)

   i) the 2nd moment about the mean, μ_2, is the variance: σ² = ∑_{i=1}^{k} (xi − μ)² f(xi)
c) for many distribution functions, knowing all the moments of f(x) is equivalent to
knowing f(x) itself.
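For example, applying the moment formulas to the two-dice distribution from section E gives a mean of 7 and a variance of 35/6 (a sketch, not from the notes):

```python
# First moment about the origin (the mean) and second moment about the mean
# (the variance) of the sum of two dice, computed exactly.
from collections import Counter
from fractions import Fraction

counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
f = {s: Fraction(c, 36) for s, c in counts.items()}

mu = sum(x * p for x, p in f.items())               # ν1 = Σ xi f(xi)
var = sum((x - mu)**2 * p for x, p in f.items())    # μ2 = Σ (xi − μ)² f(xi)
print(mu, var)  # 7 35/6
```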

4. Important moments

a) the mean μ, the “center of gravity”


b) the variance σ²: a measure of spread
   i) the standard deviation: σ = √(σ²)
c) the skewness μ3/(σ²)^(3/2): a measure of asymmetry
d) the kurtosis μ4/(σ²)²: a measure of "peakedness"
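These moments can be computed for the two-dice distribution as well (a sketch, not from the notes): the skewness comes out 0, reflecting the distribution's symmetry, and the kurtosis comes out near 2.37, below the Gaussian's value of 3.

```python
# Skewness and kurtosis of the two-dice sum, from its central moments.
from collections import Counter

counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
f = {s: c / 36 for s, c in counts.items()}

mu = sum(x * p for x, p in f.items())
m2 = sum((x - mu)**2 * p for x, p in f.items())     # variance
m3 = sum((x - mu)**3 * p for x, p in f.items())
m4 = sum((x - mu)**4 * p for x, p in f.items())

print(round(m3 / m2**1.5, 6))   # skewness: 0 for a symmetric distribution
print(round(m4 / m2**2, 2))     # kurtosis: about 2.37 here
```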
F. Estimation of random variables (RVs)

1. Assumptions/procedures
a) There exists a stable underlying p.d.f. for the RV
b) Investigating the characteristics of the RV consists of obtaining sample outcomes and
making inferences about the underlying distribution.

2. Sample statistics on a random variable X (or from a large sample “population”)


a) The ith sample outcome of the variable X is denoted Xi .

b) The sample mean: X̄ = (1/N) ∑_{i=1}^{N} Xi , where N is the sample size.

c) The sample variance: s² = [1/(N−1)] ∑_{i=1}^{N} (Xi − X̄)²

d) X̄ and s² are only estimates of the mean μ and variance σ² of the underlying p.d.f. (X̄ and s² are statistics of the sample; μ and σ² are characteristics of the population from which the sample was taken.)
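A sketch (not from the notes) of these estimates for simulated die rolls, where the population values are μ = 3.5 and σ² = 35/12 ≈ 2.92:

```python
# Sample mean and sample variance (N−1 denominator) from simulated rolls.
import random

random.seed(1)                          # fixed seed for reproducibility
sample = [random.randint(1, 6) for _ in range(10_000)]
N = len(sample)

xbar = sum(sample) / N
s2 = sum((x - xbar)**2 for x in sample) / (N - 1)
print(round(xbar, 2), round(s2, 2))     # near 3.5 and 2.92
```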


e) X̄ and s² are themselves random variables. As such, they have their own means and
variances (which can be calculated).

3. Expected value
a) The expected value of a random variable X is written E(X). It is the value that one
would obtain if a very large number of samples were averaged together.
b) As defined above:
i) E[X̄] = μ, i.e., the expected value of the sample mean is the population mean.
ii) E[s²] = σ², i.e., the expected value of the sample variance is the population variance.
c) Expectation allows us to use sample statistics to infer population statistics.
d) Properties of expectation:
i) E[aX + bY] = aE[X] + bE[Y], where a, b are constants
ii) If Z = g(X), then E[Z] = E[g(X)] = ∑_{all values x of X} g(x) Pr(X = x)

Example: Throw a die. If the die shows a “6” you win $5; otherwise you lose $1. What is the expected value E[Z] of this “game”?

Pr(X = 1) = 1/6 g(1) = -1


Pr(X = 2) = 1/6 g(2) = -1
Pr(X = 3) = 1/6 g(3) = -1
Pr(X = 4) = 1/6 g(4) = -1
Pr(X = 5) = 1/6 g(5) = -1
Pr(X = 6) = 1/6 g(6) = 5

∴ E[Z] = (−1)(5/6) + (5)(1/6) = 0, i.e., you would expect neither to win nor to lose.
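The same result, computed directly from the definition E[Z] = Σ g(x) Pr(X = x) (a sketch, not from the notes):

```python
# Expected value of the die game: win $5 on a "6", lose $1 otherwise.
from fractions import Fraction

g = {x: (5 if x == 6 else -1) for x in range(1, 7)}     # payoff function
EZ = sum(Fraction(1, 6) * g[x] for x in range(1, 7))    # Σ g(x) Pr(X = x)
print(EZ)  # 0
```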

iii) E[XY] = E[X] E[Y], provided X and Y are “independent”, i.e., samples of X cannot be used to predict anything about samples of Y (and vice versa).

Example: You have error-prone measurements for the height X and


width Y of a picture. What is the expected value of the area of the
picture XY? Answer: E[X] E[Y].
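A simulation sketch of this property (the 30 × 40 picture dimensions and the unit-variance Gaussian measurement errors are illustrative assumptions, not from the notes):

```python
# For independent error-prone height X and width Y, the average of the
# products X*Y approaches E[X]*E[Y].
import random

random.seed(2)                                       # reproducibility
N = 100_000
hs = [30.0 + random.gauss(0, 1) for _ in range(N)]   # height measurements
ws = [40.0 + random.gauss(0, 1) for _ in range(N)]   # independent widths

E_area = sum(h * w for h, w in zip(hs, ws)) / N
E_h, E_w = sum(hs) / N, sum(ws) / N
print(round(E_area, 1), round(E_h * E_w, 1))  # both near 1200.0
```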

G. Selected engineering uses of statistics

1. Measurement and errors


a) “Best” estimate of value based on error-prone measurements, e.g., three different
measurements give three different answers
b) Compound measurements each of which is error-prone, e.g., the volume of a box
whose sides are measured with error
c) Standard error (standard deviation) of “best” estimate, i.e., error bars
d) Techniques to reduce the standard error


i) repeated measurements
ii) different measurement strategy

2. Characterizing random populations


a) Distribution of traffic accidents at an intersection
b) Quantity of lumber in a forest
c) Distribution of “seconds” on an assembly line
d) Voltage fluctuation characteristics on a transmission line
