Rowe 1994
Understanding Uncertainty
William D. Rowe¹
There is more information we don’t know than we do know for making most critical decisions involving risks. Our focus must be on understanding and effectively dealing with what we don’t know. As a first step in achieving this focus, a classification of the types of uncertainties that must be addressed and the sources of these types of uncertainties is presented. The purpose is to provide a framework for discussion about addressing uncertainty, particularly in risk analyses.

Both uncertainty and variability of information are addressed using four main classes:

1) Metrical uncertainty and variability in measurement,
2) Structural uncertainty due to complexity, including models and their validation,
3) Temporal uncertainty in future and past states,
4) Translational uncertainty in explaining uncertain results.

The factors that contribute uncertainty and error to these classes are identified, and their interrelationships indicated. Both subjective and objective aspects are addressed.
These classes may be considered dimensions insofar as they may be displayed in four dimensions as orthogonal axes.

¹ Rowe Research and Engineering Associates, Inc., 309 North Alfred Street, Alexandria, Virginia 22314.
Table I lists some parameters of the classes shown in the far left column. The first set of parameters to the right shows the particular information that is unknown and uncertain. Moving to the right, the second parameter is used to discriminate between different levels of uncertainty. This parameter is often described in reverse; that is, in terms of the information used to reduce the uncertainty. The next set of parameters indicates the degree to which the uncertainty has been successfully addressed, while the last set shows the primary methods used for handling the uncertainty.

Future uncertainty is valued by “luck.” Before an event, a probability model of the likely outcomes can be constructed. After the event occurs, we consider a desirable outcome as “lucky” and an undesirable one as “unlucky.” Selecting a case where an undesirable outcome has low probability makes good sense, but the undesirable outcome can still occur.²

² This case might be considered as “competent error,” the unlucky outcome of a rare event, as opposed to “negligent error,” knowingly or unknowingly choosing a high probability for an unwanted outcome.

Past information is subject to different interpretations as requirements change or “political correctness” is imposed. The past is not always a useful prologue.

Structural uncertainty, as a result of complexity, depends on both the number of parameters used to describe a situation and their interaction as well as the degree to which models of the complex situation are useful. If the models used also correspond to reality to a suitable degree, they tend to have higher value. This is particularly the case when the models can be validated empirically.

Measurement uncertainty lies in our inability to discriminate among values within a parameter, that is, imprecision. Accuracy addresses errors made in measuring within the precision of the measurement capability to discriminate. For example, if a ruler is precise only to centimeters, but a 5-cm line is measured as 7 cm, then the measurement is inaccurate by 2 cm.

The translational dimension differs from the other three in many respects. It occurs after the first three have been considered. People have different perspectives, goals, and values as well as different capabilities and levels of training. These affect the interpretation of results of analyses and, thereby, interact with these interpretations, often contributing to the uncertainty. For example, is the glass half-full or half-empty?

All four classes are subject to variability in the phenomena of interest. Variability is a contributor to uncertainty in all dimensions. Some of the sources of variability are listed below.

1. Underlying Variants: These variants are inherent in natural systems and contribute to the spread of parameter values.

   a. Apparent inherent randomness of nature (rand var): Random behavior of natural phenomena contributes to uncertainty in all three dimensions. There are cosmological arguments that challenge underlying randomness. To avoid these arguments, the apparent random behavior is adequate to describe variability pragmatically.

   b. Inconsistent human behavior (hum var): Humans not only are erratic in their behavior, but are creative as well.

   c. Nonlinear dynamic systems (chaotic) behavior (chaos var): Small changes in the initial conditions of nonlinear dynamic systems lead to very wide variation in system behavior. The lack of control over initial conditions in real-world cases results in highly variable system behavior.

2. Collective/Individual Membership Assignment (mem var): There is an underlying and confounding contributor to variability, namely, the distinction between a collective and an individual (that is, collective behavior and a single instance of behavior for a parameter). We can often provide data about a parameter with great precision and accuracy (e.g., weight of people in the United States) but can say little about the behavior or condition (weight) of an individual
without measuring the individual. This problem is fundamental to all stochastic and behavioral systems, the latter including value conflicts. The kurtosis of a distribution may be a measure of variability and confidence levels may provide some insight to the limits of variability, but the collective/individual confounding relationship is unique and pervades all measurements. It might well be called membership uncertainty, reflecting the idea that the individual is a member of the collective, but its value in the collective cannot be determined precisely without direct measurement.

3. Value Diversity (val div): Varying perspectives and value systems among people result in irreconcilable differences among people that can be addressed only by political means.

Variability is a special contributor to uncertainty. When addressing parameters with these sources of variability, the source is identified using the abbreviations in parentheses above.

2. CLASSES OF UNCERTAINTY

2.1. Temporal-Future

Uncertainty in future states is probably the most familiar class of uncertainty. The future is uncertain; and as events become more imminent, anxiety about the outcome of those events increases. Classical decision making under uncertainty deals with future states of nature without regard to the likelihood of their occurrence and attempts to optimize the payoffs should various states occur. The payoffs are expressed as value or utility functions; and when uncertainties in these functions exist, they may confound the decision. Decision making under risk addresses the relative likelihood of alternative states and expected utility payoff strategies. Probability is a model of the likelihood of future outcomes. Differences in the choice of the probability model used contribute to the uncertainty in expressing probability levels.

As seen from Table I, the parameter for discriminating about the likelihood of future states is probability. Probability is a model with a unique characteristic: It exists as a description of the likelihood of outcomes prior to an event but collapses at the instant of the event and thereafter. These events are “gambles” with positive and negative values assigned to outcomes. Risk is the downside of such a gamble. Underlying all probability models is the belief that the future will behave as in the past. The degree of belief that a probability model correctly expresses probabilities of parameters in a system⁴ can be addressed by at least three different purviews:

(1) A priori: The underlying determinants of a gamble are known before the gamble from an examination of the system, but the outcomes are unknown.

(2) Frequentist: The underlying determinants of a gamble are estimated by sampling past events or test events. Test events are those made to directly obtain information from new samples (gambles) made for this purpose.

(3) Subjective: The underlying determinants of a gamble are estimated by the judgment of the estimator.

The subjective model should not be confused with “subjective probability,” which is a data ascension strategy (Bayes theorem) for efficiently and progressively obtaining information about gambles if a prior distribution exists.

⁴ There are two types of models addressed here. One is a model of the system, and the second is the model used to determine the probability of stochastic parameters in the system.
Before the gamble, the valuation parameter for the probability model⁵ is the “confidence” in how well the model will predict outcomes. Confidence is a valuation property of models, not of future states.

⁵ Confidence is used in two senses: (1) a degree of belief that a model representation is valid and (2) the likelihood that a statistic will lie within prescribed limits. It is used here in the first sense.

The valuation parameter for future state gambles is “luck.” Luck, as used here, implies a favorable outcome of a gamble to a “player.” It is the player who places a value on the outcome.

Although one would prefer clairvoyance, that is, certainty about the outcome of a future event, prediction (or projection) about future outcomes is the method we use to apply probability information to cope with future temporal uncertainty.

A number of sources of future temporal uncertainty are shown in Table II.

Table II. Sources of Future Temporal Uncertainty

Prediction
• Apparent inherent randomness of nature (still an open metaphysical question)—rand var
• Luck in the short run (unusual combinations of outcomes)
• Inconsistent human behavior (we often do the unexpected)—hum var
• Nonlinear dynamic (chaotic) systems behavior (highly sensitive to initial conditions)—chaos var
• Individual versus expected value behavior—memb var

Measurement
• Sparse rare events (e.g., low-probability/high-consequence events)
• Rare events imbedded in noise (e.g., new cancers in a population experiencing many similar cancers from competing causes)
• Short versus long time frames for observation (e.g., miss long-term trends or filter out short-term excursions)
• Changing internal system parameters (e.g., genetic drift)

2.2. Temporal-Past

If one has the complete set of historic information required for a given purpose, there is no past temporal uncertainty. Uncertainty in measurement or due to complexity may of course exist. Past temporal uncertainty arises from one primary source: failure to measure past state conditions when they occurred in a manner that can be retrieved when needed, that is, failure to record history. Reconstruction of partially or unrecorded history from secondary and other sources involves measurement uncertainty. Time regression of continuously variable systems equations (Lagrangian, Hamiltonian, etc.) involves uncertainty due to complexity as well, since the equations of motion are themselves mathematical models.

For stochastic discontinuous systems, reconstruction by time regression is not possible. One cannot reconstruct a probability function after it has collapsed. Historic information as to relative frequency of occurrence is required as estimated at the time of the event. Probability does not exist in the past. Hindsight in the discontinuous case addresses whether the event occurred or did not occur; it provides no information about past probability functions itself.

The discrimination parameter is historical data, which are valued by their correctness. Historical data are correct when they were recorded for the purpose now sought. Lack of correctness implies measurement uncertainty. Retrodiction is a term sometimes used for reconstruction by time regression. Sources of past temporal uncertainty are shown in Table III.

Table III. Sources of Past Temporal Uncertainty

Retrodiction
• Incomplete historical data (measurement error)
• Biased history (bias error)
• Changing system parameters preventing identical conditions to be revisited (systematic error)

Interpretation of data
• Hindsight versus foresight (20-20 hindsight)
• Lack of external data references (validation limited)
• Imposition of “political” correctness (systematic bias)
• Conflicting reports (from both the viewing position at events and the viewpoint of the observer)

Measurement uncertainty
• Measured parameters inappropriate for the purpose used (lack of correctness)

2.3. Metrical-Uncertainty in Measurement

Measurement is a means to gain information about the world as we sense it. We make observations about the empirical world using nominal, ordinal, cardinal, or ratio scales. In each case we discriminate on the basis of the precision of each type of scale, that is, the minimum unit of measurement for which one can discern differently from one unit to another using whatever measurement tools are available.

Accuracy addresses how correctly we have measured and interpreted measurement about scale values. Measurement involves taking multiple observations of scale values, and the use of statistical models to describe
the results. Statistical models, therefore, always address a plurality of observations, and a single observation has little meaning. Statistical models are valued by our confidence in the model of the underlying process by which the data are generated. Increased sample size increases confidence in the modeling of the process. The description of the process model is termed a “frequency distribution.”⁶ It is a structured listing of the relative occurrence of historical measurements. The measurement process model is often itself one of the sources of uncertainty in measurement listed in Table IV.

⁶ This should not be confused with a probability distribution, which involves temporal uncertainty.

Table IV. Sources of Measurement Uncertainty

Empirical observations
• Apparent randomness of nature—ran var
• Precision of measurements (resolution of instruments)
• Accuracy of measurements (quality of measurement)
• Measurement interaction (Heisenberg uncertainty principle)
• Systemic measurement errors (measurement bias)

Interpretation of observations
• Data inclusion/exclusion judgments
• Sample size adequacy
• Objective versus subjective methods for sampling (fixed sample size vs Bayes methods)
• Objective versus subjective methods for statistical analysis and reduction

Interpretation of measurements
• Differences in judgment and interpretation by experts (Is the glass half-empty or half-full?)
• Biases, beliefs, and dogma of experts
  (i) Bias error in making observations
  (ii) Slants in interpretation of results

2.4. Structural-Uncertainty Due to Complexity

Complexity involves the number of degrees of freedom in a system and how the parameters that express the degrees of freedom interact. Linear and nonlinear optimization theory with single and multiple objectives is a means to address complex systems. When systems become too complex to deal with all parameters directly, simplification of one or more parameters is necessary. The result is a model, an abstraction of the system studied.

The discrimination parameter for models is usefulness of the model. Some models are more useful than others regardless of how well they represent reality. For example, a model that is used successfully to get parties with different viewpoints to define what they agree and disagree upon will have a high utility, although it may have little to do with reality; it may be purely abstract. The valuation parameter for models is the confidence one has that the modeled system is properly represented. The more that one is convinced that a model is representative of the complex system, the more confidence one has that the model is valid. If the system modeled is a real system, the validity of the model is measured empirically.

Even if a model cannot be empirically validated, it may still be useful. For example, agreement on a hypothetical structure may provide a means of communicating ideas even when they are known to be invalid, or an analogy may be a useful means to convey information even though the analogy is known to be incorrect. Thus, models can be valid, useful, or both. In any case, the degree of confidence that one has about the validity or utility of a model is the means of discrimination for complexity. Probability has no meaning here. It is meaningful only in the future temporal class. Of course, structural and temporal uncertainty can occur simultaneously, and models of future conditions may be addressed. Table V lists some sources of structural uncertainty.

Table V. Sources of Uncertainty Due to Complexity

Systemic fluctuations
• Inherent random processes—ran var
• Inconsistent human behavior—hum var

Parameter interaction
• Number of systemic degrees of freedom—degree of complexity
• Completeness of parameter identification
• Interdependence of parameters
• Initial conditions in chaotic systems—chaos var

Interpretation of models
• Differences in judgments and interpretation by experts
• Biases, beliefs, and dogma of experts and proponents

Model choice uncertainty
• Oversimplified models
• Inability to verify the validity of alternative models empirically
• Selection of models that support preestablished hypotheses
• Inappropriate analogies

It has been said that we ought to understand simplicity before we can understand complexity. Perhaps simplicity is nothing more than a very simplified and pragmatic model of reality where only the critical parameters for the purpose at hand are preserved. Simple models are useful, may or may not have empirical validity, and describe common perspectives. Oversimplification of models leads to misuse and invalid application.
Table VI. Methods of Treating Uncertainty in Measuring Risk

Scientific risk
  Approach: Provide the best estimate of risk and the ranges of uncertainty above and below the best estimate.
  Limitations: Tend to measure that which is measurable rather than that which is critical but difficult to measure.

Regulatory risk
  Approach: Assure, with a given degree of confidence, that the actual risk does not exceed the risk estimate.
  Limitations: Very high margins of safety may have high cost and preclude some beneficial activities.

Design engineering
  Approach: Use conservative, proven designs to produce engineering structures with minimum liability.
  Limitations: Very high margins of safety may have high cost, and are subject to unwelcome surprises when applied in new environments.

Performance management
  Approach: Risks are one parameter used in balancing risk, costs, benefits, and performance of engineered structures. Uncertainty is addressed in all four parameters.
  Limitations: Risks cannot be measured, only modeled. Empirical verification of performance is often very difficult.
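The regulatory approach in Table VI relies on per-parameter margins of safety, and those margins compound when several estimated parameters enter one risk model. The sketch below is hypothetical; the number of parameters and the threefold factor are chosen only to show the arithmetic of compounding:

```python
# Hypothetical multiplicative risk model: the risk estimate is the product
# of several estimated parameters (e.g., release rate, transport factor,
# exposure, dose-response slope). A regulator replaces each best estimate
# with a conservative upper bound, here 3x the best estimate.
best_estimates = [1.0, 1.0, 1.0, 1.0]  # normalized best estimates
safety_factor = 3.0                    # per-parameter margin of safety

best = 1.0
conservative = 1.0
for p in best_estimates:
    best *= p
    conservative *= p * safety_factor

overall_conservatism = conservative / best
print(overall_conservatism)  # 3^4 = 81.0: a 3x margin per parameter becomes 81x overall
```

A modest margin applied parameter by parameter thus produces the very high overall conservatism the text describes, which is why removing redundant margins can matter.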
2.5. Translational-Uncertainty Due to Communication

When results of an analysis are completed, they must be presented to decision makers, professionals, stakeholders, and the public. All have different levels of training and capability of understanding the results. All have conflicting goals and values and view the analysis from their own perspective. Practitioners of different professions have differing perspectives on how they address uncertainty. The methods of treating uncertainty in measuring risk are an example, as shown in Table VI. All of these perspectives are valid. They are for different purposes. Unless a means of translation is available, practitioners from different backgrounds can be misunderstood and unable to communicate. It is important to recognize these differences in perspective when people holding different points of view attempt to communicate about uncertainty with each other.

Each type of practitioner above attempts to reduce uncertainty differently. The scientist reduces uncertainty by obtaining more and better measurements, using these for obtaining higher confidence in models and better prediction or retrodiction.

The regulator, realizing that neither better measurement nor empirical validation of risk models will resolve uncertainties in the short time required for regulatory action, uses margins of safety (contingencies) to assure that the actual level of risk lies below the estimate. These margins of safety multiply when many such parameters are estimated for use in risk models. This results in very high levels of conservatism. In the absence of effective measurements, the removal of redundant margins of safety, without sacrificing high confidence that the actual risk remains below the estimate, is an approach to correct the situation.

Design engineers ignore risk and uncertainty. They address structural complexity in trying to put subsystem designs together to fulfill an overall design requirement. They address cost-effective design and hope that adequate margins of safety have been included in standardized and tested designs. Reduced uncertainty in system complexity is sought.

As cited above, technicians deal with certainties, avoiding consideration of risks and uncertainty. Managers of technical systems deal with uncertainties and must take prudent risks in obtaining system performance; it’s a way of life. Managers seek to reduce uncertainty in all three dimensions but want to do so selectively and cost-effectively.

Dealing interchangeably with local, state, federal, and international organizations also leads to translational uncertainty. Value diversity, leading to unwarranted polarization of issues, is another case. These types of uncertainty are introduced by social behavior, and they are separate from the uncertainties in the problems addressed. Nevertheless, these uncertainties are real and often dominant.

2.6. Understanding Uncertainty and Variability

The above classification, description, and listing of sources of uncertainty are only a means to understand the contributors to uncertainty for discussion. They are not meant to be exhaustive or definitive, but provide a
basis for understanding how terminology is often misused, resulting in increased confusion. For example, probability distributions address only future temporal uncertainty. They are improperly used otherwise. Statistical distributions are used to describe measurements, and confidence levels are used to describe the validity of models.

Membership variability becomes dominant when one proceeds from a statistical distribution to an individual member of the distribution. Confidence levels, however, are a measure of belief and are basically subjective. Variability in confidence levels arises from conflicting beliefs and values as well as differing viewpoints and perspectives.

However, information is valued for the purpose intended, and different approaches for different perspectives are necessary.

Perhaps a marriage between the regulatory and the performance management approaches in Table VI would be desirable. It would address health and safety from a conservative approach, using margins of safety; but it would additionally stress the quantification of margins of safety and their costs. This would provide a basis for balancing costs, risk, and confidence in risk estimates within a rational decision-making framework.

There are many elements of uncertainty that are irreducible. Limitations in measurement as expressed by the Heisenberg uncertainty principle, inability to verify models empirically, and measurement of parameters of a “trans-science” nature are examples of theoretical limitations. There are more practical limitations: time and resources. Moreover, tradition and social values do not necessarily yield to scientific argument and evidence.

3. PERSPECTIVES ON UNCERTAINTY

3.1. Impending Uncertainty

[...] as a result, but can only be mentioned here. Four particular areas are now being addressed:

1. Classifying various methodologies for addressing uncertainty and risk and how and where these methodologies are used with respect to the dimensional characteristics that best describe their nature. Identification of common sources of error for each case will also be of interest.

2. A better understanding of the individual propensity to gamble in terms of human behavior both under analytical conditions and for behavioral responses as uncertain situations become more imminent. The implications for individual and group decision making will be explored as well.

3. Use of range/confidence estimates rather than point estimates for describing our knowledge about uncertain situations. These are measurement frameworks used to extract both information and the uncertainty involved in a rigorous manner. The combination of range estimates in complex models involves computations whose validity must be examined.

4. Better means of displaying uncertain information such that different perspectives can be understood and preserved, margins of safety and their associated costs can be made explicit, and decision makers and the public can understand the implications of uncertainty of alternative actions.
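The contrast in area 3 between point estimates and range estimates can be sketched with a small Monte Carlo propagation. The two-parameter model and the ranges below are hypothetical, chosen only to show how combined ranges spread an answer that a single point estimate hides:

```python
import random

random.seed(1)

# Hypothetical multiplicative model: risk = exposure * potency.
# Point estimates collapse each parameter to one number.
exposure_pt, potency_pt = 2.0, 0.5
risk_point = exposure_pt * potency_pt

# Range estimates: each parameter is known only to within a range,
# represented here as a uniform draw over that range.
def draw_risk():
    exposure = random.uniform(1.0, 3.0)
    potency = random.uniform(0.2, 0.8)
    return exposure * potency

samples = sorted(draw_risk() for _ in range(20_000))
low, high = samples[500], samples[19_500]  # central 95% of the outcomes

print(risk_point, (low, high))
```

The point estimate sits inside a wide interval of outcomes, and because the combination is done by sampling rather than exact analysis, the validity of such computations must itself be examined, as the text notes.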