


Systems Engineering Tradeoff Study Process
Framework
Matthew V. Cilli
Systems Engineering PhD Candidate
Stevens Institute of Technology, Hoboken, NJ
[email protected]

Gregory S. Parnell
Visiting Professor of Industrial Engineering
4207 Bell Engineering Center, Department of Industrial Engineering, University of Arkansas
[email protected]
Copyright © 2014 by Matthew V. Cilli. Published and used by INCOSE with permission.
Abstract. Tradeoff studies are a critical tool to provide information to
support decision making for discipline engineers, systems engineers, and
program managers throughout the system life cycle. Unfortunately, the
quality of trade studies is inconsistent between organizations and within
organizations. This paper reports on part of an INCOSE effort to improve
tradeoff studies and discusses a proposed INCOSE Decision Management
Process aligned with ISO/IEC 15288. The proposed process discussed in
this paper integrates decision analysis best practices with systems
engineering activities to create a baseline from which future papers can
explore possible innovations to further enhance tradeoff study quality. The
process enables enterprises to develop an in-depth understanding of the
complex relationship between requirements, the design choices made to
address each requirement, and the system level consequences of the sum of
design choices across the full set of performance requirements as well as
other elements of stakeholder value to include cost and schedule. Through
data visualization techniques, decision makers can quickly understand and
crisply communicate a complex trade-space and converge on
recommendations that are robust in the presence of uncertainty.

Introduction
Successful Systems Engineering requires good decision making. Many systems engineering
decisions are difficult decisions in that they include multiple competing objectives, numerous
stakeholders, substantial uncertainty, significant consequences, and high accountability. In
these cases, good decision making requires a formal decision management process. The
purpose of the decision management process, as defined by ISO/IEC 15288:2008, is “…to
provide a structured, analytical framework for identifying, characterizing and evaluating a set
of alternatives for a decision at any point in the life-cycle and select the most beneficial
course of action.” This paper aligns with the structure and principles of the Decision
Management Process Section of the INCOSE Systems Engineering Handbook v4.0 DRAFT
(INCOSE SE Handbook Working Group, 2014) and presents the decision management
process steps as described therein (written permission from INCOSE Handbook Working
Group pending). Building upon the foundation laid by the handbook, this paper adds a
significant amount of text and introduces ten illustrations to provide richer discussion and
finer clarity.

   
New product developments entail an array of interrelated decisions. Table 1 provides a
partial list of decision situations (opportunities) that are commonly encountered throughout a
system’s lifecycle. Many of these decisions may benefit from the holistic perspective of the
systems engineering discipline coupled with a decision model that aggregates and translates
the data produced by engineering, performance, and cost models into terms meaningful to the
various stakeholders, most importantly, the decision makers.  
Table 1 - Partial List of Decision Situations (Opportunities) Throughout the Lifecycle

Exploratory Research: Assess Technology Opportunity / Initial Business Case
• Of all the potential system concepts that could incorporate the emerging technology of interest, do any offer a potentially compelling and achievable market opportunity?
• Of those that do, which should be pursued, when, and in what order?

Concept: Inform, Generate, and Refine a Capability Development Document
• What requirements should be included?
• What really needs to be accomplished, and what is able to be traded away to achieve it within anticipated cost and schedule constraints?
• How should requirements be expressed such that they are focused yet flexible?
• How can the set of requirements be demonstrated to be sufficiently compelling while at the same time achievable within anticipated cost and schedule constraints?

Concept: Create System Architecture Alternatives and Select Best
• After considering the system level consequences of the sum of architecture design choices across the full set of stakeholder value (to include cost and schedule), which architecture alternative should be pursued?

Development: Select/Design Subsystems
• After considering the system level consequences of the sum of subsystem design choices across the full set of stakeholder value (to include cost and schedule), which subsystem alternatives should be pursued?

Development: Select/Design Components / Parts
• After considering the system level consequences of the sum of component design choices across the full set of stakeholder value (to include cost and schedule), which component alternatives should be pursued?

Development: Select/Design Test and Evaluation Methods
• What is the prototyping plan?
• What tests and evaluation should be performed?
• What is the verification plan?

Production: Craft Production Plans
• What is the target production rate?
• To what extent will low rate initial production be utilized?
• What is the ramp up plan?
• What production process will be used?
• Who will produce the system?
• Where will the system be produced?

Operation, Support: Generate Maintenance Approach
• What is the logistics concept?
• What is the preventive maintenance plan?
• What is the corrective maintenance plan?
• What is the spare parts plan?

Decision Process Context


A formal decision management process transforms a broadly stated decision
situation into a recommended course of action and an associated implementation plan. The
process is executed by a resourced decision team that consists of a decision maker with full
responsibility, authority, and accountability for the decision at hand, a decision analyst with a
suite of reasoning tools, subject matter experts with performance models, and a representative
set of end users and other stakeholders (Parnell G. S., Bresnick, Tani, & Johnson, 2013). The
decision process is executed within the policy and guidelines established by the sponsoring
agent. The formal decision management process realizes this transformation through a
structured set of activities described in the balance of this paper. Note the process presented
here does not replace the engineering models, performance models, operational models, cost
models, and expert opinion prevalent in many enterprises but rather complements such tools
by synthesizing their outputs in a way that helps decision makers thoroughly compare relative
merits of each alternative in the presence of competing objectives and uncertainty. (Buede,
2009; Parnell, G. S., Driscoll, P. J., and Henderson D. L., 2011)

Inputs
Inputs to the decision management process are often little more than broad statements of the
decision situation. As such, systems engineers should not expect to receive a well-structured
problem statement as input to the decision management process. In addition, the inputs
usually include models and simulations, test results, and operational data.
Outputs
The ultimate output of the decision management process should be a recommended course of
action and associated implementation plan provided in the form of a high quality decision
report. The decision report should communicate key findings through effective trade space
visualizations underpinned by defendable rationale grounded in analysis results that are
repeatable and traceable. As decision makers seek to understand root causes of top level
observations and build their own understanding of the tradeoffs, the ability to rapidly drill
down from top level trade-space visualizations into lower level analyses and data supporting
the synthesized view is often beneficial.
Process Activities
The decision analysis process can be decomposed into ten process steps: (i) frame decision
and tailor process, (ii) develop objectives and measures, (iii) generate creative alternatives,
(iv) assess alternatives via deterministic analysis, (v) synthesize results, (vi) identify
uncertainty and conduct probabilistic analysis, (vii) assess impact of uncertainty, (viii)
improve alternatives, (ix) communicate tradeoffs, (x) present recommendation and
implementation plan. The process used for each decision can be tailored to the decision
situation.
Process Activities Elaboration
The decision analysis process is depicted in Figure 1 below. The decision management
approach is based on several best practices:
A. Align the decision process with the systems engineering process.
B. Use the sound mathematical techniques of decision analysis for tradeoff studies. Parnell
(2009) provided a list of decision analysis concepts and techniques.
C. Develop one master decision model and refine, update, and use it as required for
tradeoff studies throughout the system development life cycle (Parnell et al., 2013).
D. Use Value-Focused Thinking (Keeney, 1992) to create better alternatives.
E. Identify uncertainty and assess risks for each decision (Parnell et al., 2013).


Figure 1: Decision Management Process Map


The white text within the outer green ring identifies elements of a systems engineering
process while the ten blue arrows represent the ten steps of the Decision Management
Process. Interactions between the systems engineering process and the Decision Management
Process are represented by the small, dotted green or blue arrows. Note these interactions are
not explicitly addressed here but are the subject of a future paper.
The focus of the process is to find system solutions that best balance competing objectives in
the presence of uncertainty as shown in the center of the figure. This single focus is
important as it can be argued that all systems engineering activities should be conducted
within the context of supporting good decision making. If a systems engineering activity
cannot point to at least one of the many decisions embedded in a system's lifecycle, one must
wonder why the activity is being conducted at all. Positioning decision management as
central to systems engineering activity will ensure the efforts are rightfully interpreted as
relevant and meaningful and thus maximize the discipline’s value proposition to new product
developers and their leadership.
The decision analysis process is iterative, with an openness to change, and adapts as
understanding of the decision and the trade-space emerges with each activity. The circular
shape of the process map is meant to convey the notion of an iterative process with
significant interaction between the process steps. The feedback loops seek to capture new
information regarding the decision task at any point in the decision process so that
appropriate adjustments can be made.
Framing Decision & Tailoring Process
The first step of the decision management process is to frame the decision and to tailor the
decision process. To help ensure the decision makers and stakeholders fully understand the
decision context and to enhance the overall traceability of the decision, the systems engineer
should capture a description of the system baseline as well as a notion for how the envisioned
system will be used along with system boundaries and anticipated interfaces. Decision
context includes such details as the timeframe allotted for the decisions, an explicit list of
decision makers and stakeholders, available resources, and expectations regarding the type of
action to be taken as a result of the decision at hand as well as decisions anticipated in the
future. (Edwards et al. 2007) The best practice is to identify a decision problem statement that
defines the decision in terms of the system life cycle. Next, three categories of decisions
should be listed: decisions that have been made, decisions to be made now, and subsequent
decisions that can be made later in the life cycle. Effort is then focused on the decisions to be
made now.
Once the decision at hand is sufficiently framed, systems engineers must select the analytical
approach that best fits the frame and structure of the decision problem at hand. For
deterministic problems, optimization models can explore the decision space. However, when
there are “… clear, important, and discrete events that stand between the implementation of
the alternatives and the eventual consequences…” (Edwards, Miles Jr., & Von Winterfeldt,
2007), a decision tree is a well suited analytical approach, especially when the decision
structure has only a few decision nodes and chance nodes. As the number of decision nodes
and chance nodes grow, the decision tree quickly becomes unwieldy and loses some of its
communicative power. However, decision trees and many optimization models require
consequences be expressed in terms of a single number. This is commonly accomplished for
decision situations where the potential consequences of alternatives can be readily monetized
and end state consequences can be expressed in dollars, euros, yen, etc. When the potential
consequences of alternatives within a decision problem cannot be easily monetized, an
objective function can often be formulated to synthesize an alternative’s response across
multiple, often competing, objectives. A best practice for this type of problem is the multiple
objective decision analysis (MODA) approach.
The decision management method most commonly employed by systems engineers is the
trade study, which more often than not employs some form of MODA approach. The aim is to
define, measure, and assess shareholder and stakeholder value and then synthesize this
information to facilitate the decision maker’s search for an alternative that represents the
optimally balanced response to often competing objectives. Major system projects often
generate large amounts of data from many separate analyses performed at the system,
subsystem, component, or technology level by different organizations. Each analysis,
however, only delivers one dimension of the decision at hand, one piece of the puzzle that the
decision makers are trying to assemble. These analyses may have varying assumptions, and
may be reported as standalone documents, from which decision makers must somehow
aggregate system level data for all alternatives across all dimensions of the trade space in
their heads. This is an ill-fated task, as all decision makers and stakeholders
have cognitive limits that preclude them from successfully processing this amount of
information in their short term memory (Miller 1956). When faced with a deluge of
information that exceeds human cognitive limits, decision makers may be tempted to
oversimplify the trade space by drastically truncating objectives and/or reducing the set of
alternatives under consideration, but such oversimplification runs a high risk of generating
decisions that lead to poor outcomes.
By providing techniques to decompose a trade decision into logical segments and then
synthesize the parts into a coherent whole, a formal decision management process offers an
approach that allows the decision makers to work within human cognitive limits without
oversimplifying the problem. In addition, by decomposing the overall decision problem into
smaller elements, experts can provide assessments of alternatives as they perform within the
objective associated with their area of expertise. Buede and Choisser put it this way,
These component parts can be subdivided as finely as needed so that the total
expertise of the system design team can be focused, in turn, on specific,
well-defined issues. The analyses on the component parts can then be
combined appropriately to achieve overall results that the decision makers can
use confidently. The benefits to the decision maker of using this approach
include increased objectivity, less risk of overlooking significant factors and,
perhaps most importantly, the ability to reconstruct the selection process in
explaining the system recommendation to others. Intuition is not easily
reproducible. (Buede & Choisser 1992)

MODA approaches generally differ in the techniques used to elicit values from stakeholders,
the use of screening techniques, the degree to which an alternative’s response to objectives
(and sub-objectives) are aggregated, the mathematics used to aggregate such responses, the
treatment of uncertainty, the robustness of sensitivity analyses, the search for improved
alternatives, and the versatility and quality of trade space visualization outputs. If time and
funding allow, systems engineers may want to conduct tradeoff studies using several
techniques, compare and contrast results, and reconcile any differences to ensure findings are
robust. Although there are many possible ways to specifically implement MODA, the
discussion contained in the balance of this paper represents a short summary of best practices.
An anticipated future paper will apply various MODA approaches to a detailed case study in
order to illustrate strengths, weaknesses, and limitations of each approach as well as to
identify potential synergies across approaches.

Developing Objectives & Measures


Defining how a decision will be made may seem straightforward, but often becomes an
arduous task of seeking clarity amidst a large number of ambiguous stakeholder need
statements. The first step is to use the information obtained from the Stakeholder
Requirements Definition Process, Requirements Analysis Process, and Requirements
Management Processes to develop objectives and measures. If these processes have not been
started, then stakeholder analysis is required. Often this begins with reading documentation
on the decision topic, followed by visits to as many decision makers and stakeholders as
reasonable to facilitate discussion about the decision problem. This is best done with
interviews and focus groups with subject matter experts and stakeholders.
For systems engineering trade-off analyses, top-level stakeholder value often includes
competing objectives of performance, development schedule, development cost, unit cost,
support costs, and long term viability. For corporate decisions, shareholder value would be
added to this list. With the top level objectives set, lower levels of objective hierarchy should
be discovered. For performance related objectives, it is often helpful to work through a
functional decomposition (usually done as part of the requirements and architectural design
processes) of the system of interest to generate a thorough set of potential objectives. Start
by identifying inputs and outputs of the system of interest and craft a succinct top level
functional statement about what the system of interest does, identifying the action performed
by the system of interest to transform the inputs into outputs. Test this initial list of
fundamental objectives for key properties by checking that each fundamental objective is
essential and controllable and that the set of fundamental objectives is complete,
non-redundant, concise, specific, and understandable. (Edwards et al. 2007) Beyond these
best practices, the creation of fundamental objectives is as much an art as it is a science. This
part of the decision analysis process clearly involves subjectivity. It is important to note
however, that a subjective process is not synonymous with an arbitrary or capricious process.
As Keeney points out,
Subjective aspects are a critical part of decisions. Defining what the decision
is and coming up with a list of objectives, based on one’s values, and a set of
alternatives are by nature subjective processes. You cannot think about a
decision, let alone analyze one, without addressing these elements. Hence, one
cannot even think about a decision without incorporating subjective aspects
(Keeney 2004)

The output of this process step takes on the form of a fundamental objectives hierarchy as
illustrated in Figure 2.

 Figure 2: Example of a Fundamental Objectives Hierarchy for a Hypothetical UAV Decision  


For each fundamental objective, a measure must be established so that alternatives that more
fully satisfy the objective receive a better score on the measure than those alternatives that
satisfy the objective to a lesser degree. A measure (also known as attribute, criterion, and
metric) must be unambiguous, comprehensive, direct, operational, and understandable.
(Keeney & Gregory 2005)
A defining feature of Multiobjective Decision Analysis (also called multiattribute value
theory) is the transformation from measure space to value space that enables mathematical
representation of a composite value score across multiple measures. This transformation is
performed through the use of a value function. Value functions describe returns to scale on
the measure. When creating a value function, one must ascertain the walk-away point on the
objective measure scale (x-axis) and map it to 0 value on the value scale (y-axis). A
walk-away point is defined as the measure score at which, regardless of how well an alternative
performs on other measures, the decision maker will walk away from the alternative. Working
with the user, find the measure score beyond which an alternative provides no additional
value, label it "stretch goal" (also called ideal) and map it to 100 (1 and 10 are also common
scales) on the value scale (y-axis). If the returns to scale are linear, connect the walk-away
value point to the stretch goal value point with a straight line. If there is reason to believe
stakeholder value behaves with non-linear returns to scale, pick appropriate inflection points
and draw the curve. The rationale for the shape of the value functions should be documented
for traceability and defensibility (Parnell et al, 2011). Figure 3 provides examples of some
common value function shapes.

Figure 3: Value Function Examples
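To make the mapping from measure space to value space concrete, the following minimal Python sketch implements a piecewise-linear value function of the kind described above. The UAV Range measure, its walk-away point, stretch goal, and inflection point are hypothetical values chosen only for illustration.

```python
import numpy as np

def piecewise_linear_value(score, walk_away, stretch_goal, inflections=None):
    """Map a raw measure score to a 0-100 value scale.

    The walk-away point maps to 0 and the stretch goal maps to 100;
    optional (score, value) inflection points describe non-linear
    returns to scale in between. Assumes more of the measure is better.
    """
    points = [(walk_away, 0.0)] + sorted(inflections or []) + [(stretch_goal, 100.0)]
    xs, ys = zip(*points)
    # np.interp clamps outside the range, so scores beyond the stretch goal earn no extra value
    return float(np.interp(score, xs, ys))

# Hypothetical "UAV Range" measure: walk-away at 5 km, stretch goal at 50 km,
# with diminishing returns above 30 km (30 km already earns 80 of 100 value points).
print(piecewise_linear_value(30, walk_away=5, stretch_goal=50, inflections=[(30, 80.0)]))  # 80.0
print(piecewise_linear_value(40, walk_away=5, stretch_goal=50, inflections=[(30, 80.0)]))  # 90.0
```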
In an effort to capture the voice of the customer, system engineers will often ask a
stakeholder focus group to prioritize their requirements. As Keeney puts it,
Most important decisions involve multiple objectives, and usually with
multiple-objective decisions, you can't have it all. You will have to accept less
achievement in terms of some objectives in order to achieve more on other
objectives. But how much less would you accept to achieve how much more?
(Keeney 2002)

The mathematics of Multiobjective Decision Analysis (MODA) requires that the weights
depend on the importance of the preferentially independent measure and the range of the measure
(walk away to stretch goal or ideal). A useful tool for determining weightings is the swing
weight matrix. For each measure, consider its importance by determining whether the measure
corresponds to a defining capability, a critical capability, or an enabling capability;
consider the variation in the measure range by considering the gap between the current capability
and the desired capability; and put the name of the measure in the appropriate cell of the
matrix. Swing weights are then assigned to each measure according to the required
relationship rules described in Figure 4. Swing weights are then converted to measure
weights by normalizing such that the set sums to one. (Parnell et al, 2011) For the purposes
of swing weight matrix use, consider a defining capability to be one that directly traces to a
verb/noun pair identified in the top level (level 0) functional definition of the system of
interest – the reason the system exists. Consider enabling capabilities to trace to functions
that are clearly not the reason the system exists but somehow allow the core functions to be
executed more fully. Let critical capabilities be those that are more than enabling but not
quite defining.

Figure 4: Swing Weight Matrix
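The normalization step described above is simple enough to show directly. In this sketch the measures, their swing-weight assignments, and the capability categories in the comments are hypothetical; only the normalization rule (measure weights sum to one) comes from the text.

```python
# Hypothetical swing weights taken from a swing weight matrix: higher values go to
# defining capabilities with large gaps between current and desired capability.
swing_weights = {
    "UAV Range": 100,   # defining capability, large capability gap
    "Endurance": 70,    # critical capability
    "Unit Cost": 40,    # enabling capability
}

total = sum(swing_weights.values())
measure_weights = {measure: sw / total for measure, sw in swing_weights.items()}

assert abs(sum(measure_weights.values()) - 1.0) < 1e-9  # weights sum to one
print(measure_weights)  # {'UAV Range': 0.476..., 'Endurance': 0.333..., 'Unit Cost': 0.190...}
```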


Generating Creative Alternatives
For many trade studies, the alternatives will be systems composed of many interrelated
subsystems. It is important to establish a meaningful product structure for the system of
interest and to apply this product structure consistently throughout the decision analysis effort
in order to aid effectiveness and efficiency of communications about alternatives. The
product structure should be a useful decomposition of the physical elements of the system of
interest.
Each alternative is composed of specific design choices for each product structure element.
The ability to quickly communicate the differentiating design features of given alternatives is
a core element of the decision making exercise. Figure 5 provides descriptions of fictitious
small UAV alternatives as an example of efficient communication of differentiating design
features. These subsystem design choices have system level consequences across the
objectives hierarchy. Every subsystem design choice will impact system level cost, system
level development schedule, and system level performance. It may be useful to think of
design choices as the levers used by the system architect to steer the system design toward a
solution that best satisfies the elements of shareholder and stakeholder value - the
fundamental objectives hierarchy. These levers are very important and care should be given
in this step of the process to clearly and completely identify specific design choices for each
product structure element for every alternative being considered. Incomplete or ambiguous
alternative descriptions can lead to incorrect or inconsistent alternative assessments in the
process described in the next section. The ability to quickly and accurately communicate the
differentiating design features of given alternatives is critical.

Figure 5: Description of Alternatives Example
Assessing Alternatives via Deterministic Analysis
With objectives and measures established and alternatives identified and defined, the decision
team should engage subject matter experts, ideally equipped with operational data, test data,
models, simulation and expert knowledge. The decision team can best prepare for subject
matter expert engagement by creating structured scoring sheets. Assessments of each
concept against each criterion are best captured on separate structured scoring sheets for each
alternative/measure combination. Each score sheet contains a summary description of the
alternative under examination and a summary of the scoring criteria against which it is being
measured. The structured scoring sheet should contain ample room for the evaluator to
document the assessed score for the particular concept against the measure followed by clear
discussion providing the rationale for the score, noting how design features of the concept
under evaluation led to the score as described in the rating criteria. Whenever possible,
references to operational data, test data, calculations, models, simulations, analogies, or
experience that led to a particular score should be documented.
After all the structured scoring sheets have been completed for each alternative/measure
combination, it is useful to summarize all the data in tabular form. Each column in such a
table would represent a measure and each row would represent a particular alternative.
Figure 6 provides a sample structure of such a table, identified here as a consequences
scorecard. Note the table itself includes notional results for the fictitious small UAV
alternatives introduced in the previous section.

Figure 6: Example of Consequences Scorecard

Note that in addition to identified alternatives, the consequences scorecard includes a row for
the ideal, whose performance measure scores are those that meet the stretch goal for each
objective. The ideal helps us to understand if we need to add new alternatives or new
technologies to existing alternatives to get closer to the ideal and is an element of
value-focused thinking covered later in this section.
Synthesizing Results
At this point in the process the decision team has generated a large amount of data as
summarized in the consequences scorecard. Now it is time to explore the data and display
results in a way that facilitates understanding. Transforming the data in the consequences
scorecard into a value scorecard is accomplished through the use of the value functions
developed in the decision analysis process step described above. In an effort to enhance
speed and depth of comprehension of the value scorecard, consider associating increments on
the value scale with a color according to heat map conventions as shown in Figure 7.

Figure 7: Example of a Value Scorecard with Heat Map

This view can be useful when trying to determine which objectives are causing a particular
alternative trouble. In addition, one can use this view to quickly see if there are objectives for
which no alternative scores well. From this view, the systems engineer can also see if there is
at least one alternative that scores above the walk-away point for all objectives. If not, the
solution set is empty and the decision team needs to generate additional alternatives or adjust
objective measures.
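As a minimal illustration of that check, the sketch below scans a value scorecard (represented here as a nested dictionary, an assumed data layout rather than anything prescribed by the paper) for alternatives that clear the walk-away point, i.e., earn value greater than zero, on every objective.

```python
def alternatives_above_walk_away(value_scorecard):
    """value_scorecard: {alternative: {measure: value on a 0-100 scale}}.

    Returns the alternatives whose value exceeds the walk-away point
    (value 0) on every measure. An empty result means the solution set
    is empty and new alternatives or revised measures are needed.
    """
    return [alt for alt, values in value_scorecard.items()
            if all(v > 0.0 for v in values.values())]

# Hypothetical value scorecard for two alternatives and two measures.
print(alternatives_above_walk_away({
    "Dove":   {"UAV Range": 80.0, "Unit Cost": 35.0},
    "Pigeon": {"UAV Range": 0.0,  "Unit Cost": 90.0},   # fails walk-away on range
}))  # ['Dove']
```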
Beyond the consequence scores for each alternative on each measure, all that was needed to
construct the visualizations covered in Figure 7 were the value functions associated with each
objective. Introducing the weighting scheme, the systems engineer can create aggregated
value visualizations. The first step in assessing an alternative's aggregated value is a
prescreen: any alternative that fails to meet the walk-away point on any objective measure has
its aggregated value set to zero regardless of how it performs on other objective measures.
For those alternatives that pass the walk-away prescreen, the additive
value model [1] uses the following equation to calculate each alternative's aggregated value:

v(x) = \sum_{i=1}^{n} w_i v_i(x_i)

where
v(x) is the alternative's value,
n is the number of measures and i = 1, ..., n indexes the measures,
x_i is the alternative's score on the ith measure,
v_i(x_i) is the single-dimensional value of a score of x_i,
w_i is the weight of the ith measure, and
\sum_{i=1}^{n} w_i = 1 (all weights sum to one).

[1] The additive model assumes preferential independence. See Keeney & Raiffa (1976) and Kirkwood (1997) for additional models.
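A minimal sketch of this calculation, including the walk-away prescreen, is shown below; the dictionary-based layout, the measure names, and the example value functions are assumptions made for illustration, not part of the paper.

```python
def aggregated_value(scores, value_functions, weights):
    """Additive value model with a walk-away prescreen.

    scores: {measure: raw score} for one alternative
    value_functions: {measure: callable mapping a raw score to 0-100 value}
    weights: {measure: normalized weight}, with weights summing to one
    """
    single_values = {m: value_functions[m](s) for m, s in scores.items()}
    if any(v <= 0.0 for v in single_values.values()):
        return 0.0  # fails the walk-away prescreen on at least one measure
    return sum(weights[m] * single_values[m] for m in single_values)

# Hypothetical two-measure example with linear value functions.
value_functions = {
    "UAV Range": lambda km: max(0.0, min(100.0, (km - 5) / (50 - 5) * 100)),
    "Unit Cost": lambda usd_k: max(0.0, min(100.0, (200 - usd_k) / (200 - 80) * 100)),
}
weights = {"UAV Range": 0.6, "Unit Cost": 0.4}
print(aggregated_value({"UAV Range": 30, "Unit Cost": 120}, value_functions, weights))  # 60.0
```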


One such aggregated visualization is the value component graph as shown in Figure 8. In a
value component graph, each alternative's total value is represented by the total length of a
segmented bar, and each bar segment represents the weighted value contributed by the
alternative of interest within a given measure (Parnell et al. 2011).

Figure 8: Value Component Graph
The heart of a decision support process for systems engineering trade analysis is the ability to
integrate otherwise separate analyses into a coherent, system level view that traces
consequences of design decisions across all dimensions of stakeholder value. The
stakeholder value scatterplot shows in one chart how all system level alternatives respond in
multiple dimensions of stakeholder value.

Figure 9: Example of a Stakeholder Value Scatterplot
Figure 9 is an example of a stakeholder value scatterplot, showing how the six hypothetical
UAV alternatives respond to five dimensions of stakeholder value - unit cost, performance,
development schedule, growth potential, and operation and support costs. Each system
alternative is represented by a scatterplot marker. An alternative’s unit cost and performance
value are indicated by a marker’s x and y position respectively. An alternative’s
development risk is indicated by the color of the circle (green-low, yellow-medium, red-high)
while the degree of growth potential for a particular alternative is shown as the number of
hats above the circular marker (1 hat – low growth, 2 hats – moderate growth, 3 hats – high
growth). Figure 9 depicts an alternative with high operating and support (O&S) costs with a
red dollar sign appearing inside the marker. An alternative with moderate or low O&S costs
would appear with a black dollar sign or no dollar sign respectively.
Identifying Uncertainty & Conducting Probabilistic Analysis
As part of the assessment, it is important for the subject matter expert to explicitly discuss
potential uncertainty surrounding the assessed score and variables that could impact one or
more scores. One source of uncertainty that is common within system engineering trade off
analyses that explore various system architectures is technology immaturity. System design
concepts are generally described as a collection of subsystem design choices but if some of
the design choices include technologies that are immature, there may be a lack of detail
associated with component level design decisions that will eventually be made downstream
during detailed design. Many times the subject matter expert can assess an upper, nominal,
and lower bound measure response by making three separate assessments: 1) assuming low
performance, 2) assuming moderate performance, and 3) assuming high performance. Once
the uncertainties have been assessed, Monte Carlo simulations can be executed to identify the
uncertainties that impact the decision findings and those that are inconsequential to the
findings.
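A minimal sketch of how such three-point assessments might feed a Monte Carlo simulation is shown below. The use of triangular distributions, the measure names, and the numbers are assumptions for illustration; in practice each sampled set of scores would be pushed through the value functions and additive model to build a distribution of total value per alternative.

```python
import random

# Hypothetical low / nominal / high assessments from a subject matter expert.
assessments = {
    "UAV Range": (20.0, 30.0, 38.0),   # km: low, nominal, high
    "Endurance": (4.0, 6.0, 9.0),      # hours: low, nominal, high
}

def sample_measure_scores(n_trials=10_000, seed=0):
    """Draw raw measure scores from triangular distributions built from
    the expert's low / nominal / high assessments."""
    rng = random.Random(seed)
    for _ in range(n_trials):
        yield {m: rng.triangular(low, high, nominal)
               for m, (low, nominal, high) in assessments.items()}

# Each sampled score set would then be scored with the value functions and
# additive value model; comparing the resulting value distributions across
# alternatives shows which uncertainties actually change the findings.
first_trial = next(sample_measure_scores())
print(first_trial)
```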

   
Assessing Impact of Uncertainty - Analyzing Risk and Sensitivity
Decision analysis uses many forms of sensitivity analysis including line diagrams, tornado
diagrams, waterfall diagrams and several uncertainty analyses including Monte Carlo
Simulation, decision trees, and influence diagrams (Parnell et al., 2013). Due to space limits,
only line diagrams of sensitivity to weighting will be discussed in this paper. Anticipated
future papers will discuss potential innovations in the area of sensitivity analysis using Monte
Carlo simulations.
Many decision makers will want to understand how sensitive a particular recommendation is
to weightings and will ask questions regarding the degree to which a particular weighting
would need to be changed in order to change the recommended alternative. A common
approach to visualizing the impact of measure weighting on overall value is to sweep
each measure's weighting from absolute minimum to absolute maximum while holding the
relative relationship between the other measure weightings constant and noting changes to the
overall score. The output of this type of sensitivity analysis is in the form of a line graph
(Parnell et al. 2011). An example of such a graph for the hypothetical UAV example for the
UAV Range measure is provided in Figure 10 below. Note this particular example shows
how sweeping the weight associated with the UAV Range impacts performance value. The
yellow vertical line in Figure 10 indicates the weight of the UAV Range as determined in
earlier steps in the process. In this case, the best alternative is sensitive to the weight
assessed for UAV Range. The graph in this example shows that the alternatives with the highest
performance value are Dove and Pigeon; as the weight of UAV Range is increased, Dove emerges
as the top performer, and as the weight of UAV Range is decreased, Pigeon becomes the
alternative with the highest performance value. From this graph, a decision maker can see that
the differentiation between Pigeon's and Dove's performance value is not large, regardless of
the UAV Range weight.

Figure 10: Weight Sweep Line Graph for Hypothetical UAV Example
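A sketch of the underlying computation is given below: one measure's weight is swept from 0 to 1 while the remaining weights are rescaled to preserve their relative proportions, and each alternative's total value is recomputed at every step. The data structures and example values are assumptions for illustration, not taken from the paper.

```python
def sweep_weight(measure, weights, single_values_by_alt, steps=11):
    """Sweep one measure's weight from 0 to 1 while rescaling the other
    weights so they keep their relative proportions and still sum to one;
    return (swept weight, {alternative: total value}) for each step."""
    others = {m: w for m, w in weights.items() if m != measure}
    other_total = sum(others.values())
    results = []
    for k in range(steps):
        w_swept = k / (steps - 1)
        scale = (1.0 - w_swept) / other_total if other_total else 0.0
        trial_weights = {m: w * scale for m, w in others.items()}
        trial_weights[measure] = w_swept
        totals = {alt: sum(trial_weights[m] * vals[m] for m in trial_weights)
                  for alt, vals in single_values_by_alt.items()}
        results.append((w_swept, totals))
    return results

# Hypothetical single-dimensional values for two alternatives on two measures.
values = {"Dove":   {"UAV Range": 90.0, "Endurance": 50.0},
          "Pigeon": {"UAV Range": 60.0, "Endurance": 85.0}}
for w, totals in sweep_weight("UAV Range", {"UAV Range": 0.5, "Endurance": 0.5}, values, steps=3):
    print(round(w, 2), {alt: round(v, 1) for alt, v in totals.items()})
```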

   
Improving Alternatives
One could be tempted to end the decision analysis here, highlight the alternative that has the
highest total value, and claim success. Such a premature ending, however, would not be
considered best practice. Mining the data generated for the first set of alternatives will likely
reveal opportunities to modify some subsystem design choices to claim untapped value and
reduce risk. Recall the cyclic decision analysis process map and the implied feedback.
Taking advantage of this feedback loop and using initial findings to generate new and
creative alternatives starts the process of transforming the decision process from
"Alternative-Focused Thinking" to "Value-Focused Thinking" (Keeney 1992). To complete
the transformation from alternative-focused thinking to value-focused thinking, consider
taking additional steps to spark focused creativity to overcome anchoring biases. As Keeney
warns,
Once a few alternatives are stated, they serve to anchor thinking about others.
Assumptions implicit in the identified alternatives are accepted, and the generation of
new alternatives, if it occurs at all, tends to be limited to a tweaking of the alternatives
already identified. Truly creative or different alternatives remain hidden in another
part of the mind, unreachable by mere tweaking. Deep and persistent thought is
required to jar them into consciousness. (Keeney 1993)

To help generate a creative and comprehensive set of alternatives, consider conducting an
alternative generation table (also called a morphological box) (Buede, 2009; Parnell et al.
2011) analysis to generate new alternatives.
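A minimal sketch of an alternative generation table is shown below: taking every combination of one design choice per product structure element enumerates candidate alternatives, which would then be screened for feasibility before assessment. The product structure elements and design choices listed are hypothetical.

```python
from itertools import product

# Hypothetical alternative generation table (morphological box) for a small UAV:
# candidate design choices for each product structure element.
design_choices = {
    "Airframe":   ["fixed wing", "quad rotor"],
    "Propulsion": ["battery electric", "small gas engine"],
    "Sensor":     ["EO only", "EO/IR"],
}

# Every combination of one choice per element is a candidate alternative;
# infeasible combinations would be screened out before detailed assessment.
alternatives = [dict(zip(design_choices.keys(), combo))
                for combo in product(*design_choices.values())]
print(len(alternatives))   # 8 candidate alternatives
print(alternatives[0])     # {'Airframe': 'fixed wing', 'Propulsion': 'battery electric', 'Sensor': 'EO only'}
```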
Communicating Tradeoffs
This is the point in the process where the decision team identifies key observations regarding
what stakeholders seem to want and what they must be willing to give up in order to achieve it.
It is here where the decision team can highlight the design decisions that most influence
shareholder and stakeholder value and those that are inconsequential. In addition, the important
uncertainties and risks should also be identified. Observations regarding combination effects
of various design decisions are also important products of this process step. Competing
objectives that are driving the trade should be explicitly highlighted as well.
Presenting Recommendations & Implementing Action Plan
It is often helpful to describe the recommendation in the form of a clearly worded, actionable
task list to increase the likelihood of the decision analysis leading to some form of action,
thus delivering some tangible value to the sponsor. Reports are important for historical
traceability and future decisions. Take the time and effort to create a comprehensive, high
quality report detailing study findings and supporting rationale. Consider static paper reports
augmented with dynamic hyper-linked e-reports.

Conclusions
The process discussed in this paper integrates decision analysis best practices with systems
engineering activities to create a baseline from which future papers can explore possible
innovations to further enhance tradeoff study quality. The process enables enterprises to
develop an in-depth understanding of the complex relationship between requirements, the
design choices made to address each requirement, and the system level consequences of the
sum of design choices across the full set of performance requirements as well as other
elements of stakeholder value to include cost and schedule. Through data visualization
techniques, decision makers can quickly understand and crisply communicate a complex
trade-space and converge on recommendations that are robust in the presence of uncertainty.

   
Acknowledgments
This paper was informed by discussions with our colleagues on the INCOSE Decision
Analysis Working Group. The group is chaired by Frank Salvatore (DRC). The members
include Dennis Buede (Innovative Decisions Inc.), Gregory Parnell (University of Arkansas),
and Richard Swanson (DRC).

References
Buede, D.M. & Choisser, R.W., 1992. Providing an analytic structure for key system design choices. Journal of Multi-Criteria Decision Analysis, 1(1), pp. 17–27.

Buede, D.M., 2009. The Engineering Design of Systems: Models and Methods. Wiley.

Edwards, W., Miles Jr., R.F. & Von Winterfeldt, D., 2007. Advances in Decision Analysis: From Foundations to Applications. Cambridge University Press.

INCOSE SE Handbook Working Group, 2014. INCOSE Systems Engineering Handbook v4.0 DRAFT. INCOSE-TP-2003-002-03.2.2. March 2014.

Keeney, R.L. & Raiffa, H., 1976. Decisions with Multiple Objectives: Preferences and Value Tradeoffs. Wiley, New York, NY.

Keeney, R.L., 1992. Value-Focused Thinking: A Path to Creative Decisionmaking. Harvard University Press, Cambridge, MA.

Keeney, R.L., 1993. Creativity in MS/OR: Value-focused thinking—Creativity directed toward decision making. Interfaces, 23(3), pp. 62–67.

Keeney, R.L., 2004. Making better decision makers. Decision Analysis, 1(4), pp. 193–204.

Keeney, R.L. & Gregory, R.S., 2005. Selecting attributes to measure the achievement of objectives. Operations Research, 53(1), pp. 1–11.

Kirkwood, C.W., 1997. Strategic Decision Making: Multiobjective Decision Analysis with Spreadsheets. Duxbury Press, Belmont, CA.

Miller, G.A., 1956. The magical number seven, plus or minus two: some limits on our capacity for processing information. Psychological Review, 63(2), p. 81.

Parnell, G.S., 2009. Decision Analysis in One Chart. Decision Line, Newsletter of the Decision Sciences Institute, May 2009.

Parnell, G.S., Driscoll, P.J. & Henderson, D.L. (eds.), 2011. Decision Making for Systems Engineering and Management, 2nd edition. Wiley Series in Systems Engineering. John Wiley & Sons.

Parnell, G.S., Bresnick, T., Tani, S. & Johnson, E., 2013. Handbook of Decision Analysis. John Wiley & Sons.

   
Biography
Matthew Cilli is a Systems Engineering Ph.D. candidate at Stevens Institute of Technology in Hoboken, NJ, and leads an analytics group at the U.S. Army's Armament Research, Development and Engineering Center (ARDEC) in Picatinny, NJ. Mr. Cilli graduated from Villanova University, Villanova, Pennsylvania, with a Bachelor of Electrical Engineering and a minor in Mathematics in May 1989. He is also a graduate of Polytechnic University, Brooklyn, NY, where he received a Master of Science in Electrical Engineering in January 1992, and in May 1998 he graduated from the University of Pennsylvania, Wharton Business School, Philadelphia, PA, with a Master of Technology Management.

Dr. Gregory S. Parnell is a Visiting Professor of Industrial Engineering at the University of Arkansas. He teaches systems engineering, decision analysis, and operations research courses. He co-edited Decision Making for Systems Engineering and Management, Wiley Series in Systems Engineering, 2nd Ed., Wiley & Sons Inc., 2011, and co-wrote the Wiley & Sons Handbook of Decision Analysis, 2013. Dr. Parnell has taught at West Point, the United States Air Force Academy, Virginia Commonwealth University, and the Air Force Institute of Technology. He is a fellow of the International Council on Systems Engineering (INCOSE), the Institute for Operations Research and the Management Sciences, the Military Operations Research Society, the Society for Decision Professionals, and the Lean Systems Society. He is a retired Colonel in the U.S. Air Force. Dr. Parnell received his Ph.D. from Stanford University.

   
