PW07 Model Experimentation Analysis

The document discusses model experimentation and analysis. It covers topics like sensitivity analysis, scenario analysis, and experimentation in discrete and continuous event modeling. The document provides guidance on conducting simulation experiments and analyzing the results.


Model Experimentation and Analysis

Ruang L301-302
Gedung Laboratorium
Departemen Teknik Industri
Fakultas Teknik
Universitas Indonesia
systems.ie.ui.ac.id
Agenda

• Orient to Model Usage
• Experimentation Principles
• Experimentation Analysis: Sensitivity vs Scenario
• Experimentation in Discrete Event Modeling
• Experimentation in Continuous Event Modeling
Steps in Simulation Modeling

• Problem Formulation
• Goal Setting
• Model Conceptualization
• Data Collection
• Model Translation
• Verification and Validation
• Experimental Design
• Production Runs and Analysis
• Documentation/Reporting
• Implementation

Modeling Processes (Sterman 2000)
• Results of any step can yield insights that lead to revisions in any earlier step (indicated by
the links in the center of the diagram).

1. Problem Articulation (Boundary Selection)
2. Dynamic Hypothesis
3. Formulation
4. Testing
5. Policy Formulation & Evaluation

The modeling process is iterative.
Orient Your Analysis to the Two Types of Model Usage

• Explanatory Models: to explain – revealing relevant ideas and theories; they can lead to prediction, because theories tend to remain stable into the future
• Exploratory Models: to explore – playing with alternatives; building intuition, discovering novel insights, testing hypotheses and different policy alternatives

Predictive Modeling: "trying to predict the unpredictable"
Exploratory Modeling: "the search for insight"

"Exploratory Modeling and the Use of Simulation for Policy Analysis" – Bankes (1993)
Comparison of the Exploratory and Explanatory Approaches

Purpose
• Exploratory: Provide learning about the given situation and its structure, identify its key concerns, and generate resolution possibilities, in support of strategic, tactical, and managerial choices.
• Explanatory: Provide a useful description and explanation of why a phenomenon is the way it is, in support of more accurate decision making in the real world.

Requirements
• Exploratory: The main model consists of sub-models built from scratch along the way; these interim model portions are used to gather insights that guide further model consolidation and improvement.
• Explanatory: Needs to fit well enough with a sufficient portion of the knowledge and observations regarding the phenomenon for the model to be useful.

Output
• Exploratory: A set of inter-related, not necessarily tightly-coupled, "semi-finished" simpler exploratory models.
• Explanatory: A model that relates to reality, although it cannot be claimed to be absolute reality.
Purposes of Simulation Revisited

• To explore alternatives
• To improve the quality of decision making
• To enable more effective planning
• To improve understanding of the business
• To enable faster decision making
• To provide more timely information
• To enable more accurate forecasts
• To generate cost savings
Repetition vs Replication
• Be careful: these terms have different meanings in different contexts.
• By definition: to repeat means to do something again and again; to replicate means to make an exact copy of, or to reproduce.
• In experimental work:
  • Repetition is conducting the experiments in your own model; replication is when your model is reproduced by someone else, who then runs it.
  • Repetition is conducting multiple trials of the experiments; replication is redoing the entire experiment.
  • Repetition relates to precision; replication relates to accuracy.
• In computer simulation experiments, however, we usually speak of replications, because we use statistical distributions to randomize the experiments.
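In code, replications are simply independent runs of the same model with different random-number streams. A minimal sketch in Python, using a made-up toy single-server queue (the model and its rates are illustrative, not taken from this deck):

```python
import random
import statistics

def simulate_queue(seed, n_customers=1000):
    """Toy single-server queue with exponential interarrivals and services.
    Returns the average waiting time for one replication."""
    rng = random.Random(seed)  # independent stream per replication
    arrival = 0.0
    server_free = 0.0
    total_wait = 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(1.0)              # mean interarrival = 1.0
        start = max(arrival, server_free)
        total_wait += start - arrival
        server_free = start + rng.expovariate(1.25)  # mean service = 0.8
    return total_wait / n_customers

# Replications: the same experiment re-run with independent random streams
results = [simulate_queue(seed) for seed in range(10)]
print(f"mean wait = {statistics.mean(results):.3f}, "
      f"stdev across replications = {statistics.stdev(results):.3f}")
```

For these illustrative rates, queueing theory puts the long-run mean wait near 3.2, so the spread across replications gives a sense of how precise any single run is.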
Experimentation Tips

• Random input leads to random output (RIRO).
  • Run a simulation once: what does it mean? Was that run "typical" or not?
  • What is the variability from run to run (of the same model)?
  • You need statistical analysis of the output data.
• Time frame of simulations:
  • Terminating: specific starting and stopping conditions
  • Steady-state: long-run (technically forever)
  • Here: terminating
• Always return to the problem articulation and the Dynamic Hypothesis when experimenting.
• How many runs should I perform? NEVER just once or twice. Consider:
  • Computing power
  • Confidence in the results (variability of the model)
  • Trial and error
• Create scenarios based on the problem questions. How many? Usually three:
  • Pessimistic
  • Optimal
  • Ideal
• Be familiar with the reporting capabilities of each simulation application:
  • Statistical module (output and graphs)
  • Ability to export to different file types (txt, xls, etc.) for further statistical analysis
  • Output Analyzer
• Be careful with information overload; focus on answering the problem questions.
Agenda

• Orient to Model Usage
• Sensitivity Analysis
• Scenario Analysis
• Experimentation in Discrete Event Modeling
• Experimentation in Continuous Event Modeling

Questions model users should ask but usually don't
(Sterman, 2000, p. 852)
Sensitivity Analysis

• The concept of leverage
  • At which point in space can we place a fulcrum so that a rod can move the earth at the other end?

Leverage Points
Sensitivity Analysis

• The consequences of changes in model inputs are assessed.
• It is a time-consuming process, especially if there are many model inputs. For this reason, sensitivity analysis should be restricted to a few key inputs.
Sensitivity Analysis
With respect to which variables do the "results" vary greatly?

• Numerical sensitivity
  • A change in assumptions changes the numerical values of the results
• Behavior mode sensitivity
  • A change in assumptions changes the patterns of behavior generated by the model
• Policy sensitivity
  • A change in assumptions reverses the impacts or desirability of a proposed policy
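Numerical sensitivity can be probed with a one-at-a-time sweep: perturb each input around a baseline and compare the outputs. The sketch below uses the closed-form M/M/1 mean number in system as a stand-in for any simulation output (an illustrative assumption, not a model from this deck):

```python
def model(arrival_rate, service_rate):
    """Steady-state M/M/1 mean number in system, standing in for a
    simulation output (assumes arrival_rate < service_rate)."""
    rho = arrival_rate / service_rate
    return rho / (1 - rho)

baseline = {"arrival_rate": 0.8, "service_rate": 1.0}
base_out = model(**baseline)

# One-at-a-time sweep: perturb each input by +/-10% and compare outputs
for name in baseline:
    for factor in (0.9, 1.1):
        inputs = dict(baseline)
        inputs[name] *= factor
        out = model(**inputs)
        pct = 100 * (out - base_out) / base_out
        print(f"{name} x{factor}: output {out:.2f} ({pct:+.0f}% vs baseline)")
```

Near saturation the same 10% perturbation produces wildly different output changes, which is exactly the kind of leverage point sensitivity analysis is meant to find.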
Agenda

• Orient to Model Usage
• Sensitivity Analysis
• Scenario Analysis
• Experimentation in Discrete Event Modeling
• Experimentation in Continuous Event Modeling
Scenario Builder as Pathfinder Tales
• Figuring out the future and its paths is challenging.

History of Scenarios
• Scenarios started in war, to anticipate future enemy movements.
• They predict not just the end state, but the possible multiple paths to get there.
• Why multiple paths? The risk of failure.

What a Scenario Is
• A scenario is not sensitivity analysis.
• A scenario creates the context for a single sensitivity analysis or for changes to multiple variables.
• A scenario is story-based, narrative.
• The end result of scenario analysis is not an accurate picture of tomorrow, but better decisions about the future.
Scenarios

What is a Scenario?
• A sequence of alternative images of how the future may unfold, created from mental or formal models, which reflects the integration of past, present and future developments (Rotmans et al., 2000)
• Plausible alternative futures (not necessarily the most probable) about the co-evolutionary pathways of human and ecological systems
• Narratives of the future evolution of previously planned actions, with the purpose of focusing on causal processes and decision points (Alcamo, 2001; Kahn & Wiener, 1967)

• Scenarios are not about predicting the future; rather, they are about perceiving futures in the present.
• They are not prophecies, and they are not predictions based on "gut feelings".
• Plausible, not possible.

The world of facts (the past) vs. the world of perceptions (the future).
What is the Purpose of Scenarios?

• Policy analysis: providing a picture of future alternative states of human and ecological systems in the absence of additional policies ("baseline scenarios") and comparing these with the future effects of environmental protection policies ("policy scenarios")
• Raising awareness about emergent problems and about possible future interrelationships between different issues
• Broadening perspectives on certain themes, accounting for larger time and spatial scales of analysis, and highlighting the consequences of strategic choices
• Synthesizing information about possible futures, including both "qualitative" scenarios (e.g. in the form of narratives/storylines, diagrams or other visual symbols) and "quantitative" scenarios (e.g. providing information in the form of tables and graphs, usually based on the results from computer models)
• Dealing with uncertainty and complexity, by confronting decision-makers with the present lack of knowledge about system conditions and underlying dynamics, thus rendering decision-making processes more transparent and precautionary
• Promoting public participation, allowing for integrating the normative dimensions of sustainability, widening the knowledge base, developing a common language and enhancing mutual learning
Scenarios are Stories
Stories or Narrative
• Narrative stories give meaning to events (contextualizing).
• Events are interconnected in plots.
• A good story plot requires:
  • A stage
  • Actors
  • Props
• Stories frame the audience's perceptions.

Typical Plots
• Challenge and response
• Winners and losers
• Evolution
• Revolution
• Cycles
• Infinite possibilities

When humans ask an ultra-intelligent computer, "Do you compute that you will ever think like a human being?", after long bleeps and blinks the computer answers: "That reminds me of a story."
Steps of Scenarios Development

Generic Steps of Scenario Building (Schwartz, 1996)
• Identify focal issues or decisions
• Identify the key forces influencing those issues or decisions (the driving forces)
• Rank by importance and uncertainty
• Select the scenario logics
• Flesh out (detail) the scenarios with sub-plots and implications
• Select indicators and signposts for the scenario paths

Building Blocks of Scenarios
• Driving forces
• Predetermined elements: base year, time horizon, time step
• Critical uncertainties: based on predetermined elements
• Created events or scenes: comprising a plot; narrative, pictures and numbers
Dimensions on Uncertainty and Its Impacts

• Level of uncertainty (a knowledge matrix):
  • Knowing what we know
  • Knowing what we don't know
  • Not knowing what we know
  • Not knowing what we don't know
• Level of impacts:
  • Better
  • Worse
  • Same
  • Change
Things to Consider in Developing Scenarios

• The purpose of modeling: what it will be used for, derived from user needs
• Defining the key variables: from outputs to inputs, and the variations of each variable
• The capabilities of the software you use
• The capabilities of the hardware
• Resource limitations: time, expertise, money, etc.
Agenda

• Orient to Model Usage
• Experimentation Principles
• Experimentation Analysis: Sensitivity vs Scenario
• Experimentation in Discrete Event Modeling
• Experimentation in Continuous Event Modeling
Discrete Event Modeling Experimentation
Dealing with Initialization Bias: Warm-up & Initial Conditions

• Graphical methods: involve the visual inspection of time-series of the output data.
• Heuristics approaches: apply simple rules with few underlying assumptions.
• Statistical methods: rely upon the principles of statistics for determining the
warm-up period.
• Initialization bias tests: identify whether there is any initialization bias in the
data. Strictly these are not methods for identifying the warm-up period, but
they can be used in combination with warm-up methods to determine whether
they are working effectively.
• Hybrid methods: these involve a combination of graphical or heuristic methods
with an initialization bias test.
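The graphical approach can be sketched in the spirit of Welch's method: average the output across replications, smooth it with a moving average, and look for where the curve flattens. The toy output series below is made up for illustration; a real study would plot `smooth` and inspect it visually.

```python
import random

def replicate_output(seed, n=500):
    """Toy output series that starts empty and drifts toward a steady state."""
    rng = random.Random(seed)
    level, out = 0.0, []
    for _ in range(n):
        level += 0.1 * (10 - level) + rng.gauss(0, 0.5)  # approaches ~10
        out.append(level)
    return out

# Average the series across replications to reduce run-to-run noise
reps = [replicate_output(s) for s in range(5)]
avg = [sum(col) / len(col) for col in zip(*reps)]

# Smooth with a centered moving average (half-window w); the warm-up
# period ends roughly where the smoothed curve levels off.
w = 20
smooth = [sum(avg[max(0, i - w):i + w + 1]) / len(avg[max(0, i - w):i + w + 1])
          for i in range(len(avg))]
print(f"early mean {sum(smooth[:50]) / 50:.2f} vs "
      f"late mean {sum(smooth[-50:]) / 50:.2f}")
```

The biased-low start shows up as the early mean sitting below the late mean; observations before the flattening point are the candidates for deletion.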
Warm Up and Run Length

Most models start empty and idle:
• Empty: no entities present at time 0
• Idle: all resources idle at time 0
• In a terminating simulation this is OK if it is realistic.
• In a steady-state simulation, though, this can bias the output for a while after startup.
  • The bias can go either way, but is usually downward (results biased low) in queueing-type models that eventually get congested.
  • Depending on the model, parameters, and run length, the bias can be very severe.

Remedies for initialization bias:
• Use a better starting state, more typical of steady state.
  • Throw some entities around the model.
  • This can be inconvenient to do in the model.
  • How do you know how many to throw, and where? (This is what you're trying to estimate in the first place.)
• Make the run so long that the bias is overwhelmed.
  • Might work if the initial bias is weak or dissipates quickly.
• Let the model warm up, still starting empty and idle.
  • Simulate module: Warm-Up Period (in time units!)
  • "Clears" all statistics at that point for the summary report and any cross-replication data saved with the Statistics module's Outputs area (but not Time-Persistent or Tallies).
Warm Up and Run Length (cont’d.)

• Warm-up and run length times?
  • Most practical idea: preliminary runs and plots
  • Simply "eyeball" them
  • Statistics module, Time-Persistent and Tallies areas, then Plot with the Output Analyzer
  • Be careful about variability: make multiple replications and superimpose the plots
  • Also, be careful to note "explosions"

Reading the example plots:
• No explosions
• All measures seem to be settling into steady state
• The run length seems adequate to reach steady state
• The warm-up period is hard to judge ...
Warm Up and Run Length (cont’d.)

• "Crop" the plots to time 0 - 5,000
  • Plot dialog, "Display Time from … to …"
• Conservative warm-up: maybe 2,000
• If the measures disagree, use the maximum warm-up
Discrete Event Modeling Experimentation
Selecting the Number of Replications and Run-Length

• Rule of Thumb: at least three to five replications


• Graphical Methods

• Confidence Interval Methods
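The confidence-interval method can be sketched as follows: make a few pilot replications, compute the confidence-interval half-width, and keep adding replications until the half-width meets a precision target. The replication results and the 5% target below are made-up numbers for illustration.

```python
import statistics
from math import sqrt

# Illustrative pilot replication results (e.g., average time in system per run)
results = [24.1, 26.8, 23.5, 25.9, 27.2]

def half_width(data, t_crit=2.776):
    """95% CI half-width; t_crit is t(0.975, df=4) for 5 replications."""
    return t_crit * statistics.stdev(data) / sqrt(len(data))

mean = statistics.mean(results)
hw = half_width(results)
print(f"mean = {mean:.2f} +/- {hw:.2f}")

# Keep adding replications until the half-width falls below the target
# precision (here 5% of the mean).
target = 0.05 * mean
if hw > target:
    # rough estimate, assuming the stdev stays about the same
    n_needed = (hw / target) ** 2 * len(results)
    print(f"roughly {n_needed:.0f} replications needed for 5% precision")
```

Note the hardcoded t value: for a different number of pilot replications you would look up the matching Student's t critical value instead.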


Truncated Replications

• If you can identify appropriate warm-up and run-length times, just make
replications as for terminating simulations
• Only difference: Specify Warm-Up Period in Simulate module
• Proceed with confidence intervals, comparisons, all statistical analysis as in
terminating case
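The truncation step itself is simple: delete the warm-up portion of each replication's output before computing statistics, then proceed as in the terminating case. A minimal sketch with made-up replication data (the warm-up length of 100 observations is assumed to come from a prior warm-up analysis):

```python
import statistics

warmup = 100  # observations to discard, chosen from warm-up analysis

def truncated_mean(series, warmup):
    """Mean of one replication's output after deleting the warm-up portion."""
    return statistics.mean(series[warmup:])

# Illustrative: three replications of a biased-low start-up series
replications = [
    [i / 10 for i in range(100)] + [10.0 + (i % 3) * 0.1 for i in range(400)],
    [i / 12 for i in range(100)] + [10.1 + (i % 4) * 0.1 for i in range(400)],
    [i / 15 for i in range(100)] + [9.9 + (i % 5) * 0.1 for i in range(400)],
]

means = [truncated_mean(r, warmup) for r in replications]
print(f"truncated replication means: {[round(m, 2) for m in means]}")
print(f"grand mean = {statistics.mean(means):.2f}")
```

The truncated replication means then feed directly into confidence intervals and comparisons, exactly as for a terminating simulation.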
Determining Important Experimentation Factors

• Data Analysis
• Expert Knowledge
• Preliminary Experimentation

When factors are changed in isolation, they may have a very different effect from when they are changed in combination (ceteris paribus does not apply).
Agenda

• Orient to Model Usage
• Experimentation Principles
• Experimentation Analysis: Sensitivity vs Scenario
• Experimentation in Discrete Event Modeling
• Experimentation in Continuous Event Modeling
Behavior and Behavior Over Time Graphs
• How SD analyzes the results of a model

Behavior
Behavior (American spelling) or behaviour (British spelling) refers to the actions and mannerisms made by organisms, systems, or artificial entities in conjunction with their environment, which includes the other systems or organisms around them as well as the physical environment. It is the response of the system or organism to various stimuli or inputs, whether internal or external, conscious or subconscious, overt or covert, and voluntary or involuntary.

How can you tell someone's behavior?
• Observation
• Looking at a report card ("rapor") or past history
• A behavioral test: working together

The key to behavior is: over time, a collection of events across a longer time period.

Behavior Over Time (BOT) Graphs
• Analysis is therefore based on Behavior Over Time, represented as a line graph with time on the x-axis.
• It works both ways: by drafting BOT graphs we can deduce the structure of the system.
• Changing the structure should change the behavior and the responses to events.

Common modes of behavior in dynamic systems

Figure 4-1
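The fundamental modes arise from simple feedback structures, and each can be reproduced numerically. A minimal sketch using first-order Euler integration (the rates, goals, and carrying capacity below are illustrative choices, not values from the figures):

```python
def euler(deriv, x0, dt=0.5, steps=400):
    """First-order Euler integration of dx/dt = deriv(x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + dt * deriv(xs[-1]))
    return xs

# Three fundamental modes, each generated by a simple feedback structure:
growth = euler(lambda x: 0.05 * x, x0=1.0)               # positive loop: exponential growth
seeking = euler(lambda x: 0.1 * (100 - x), x0=1.0)       # negative loop: goal-seeking
s_shaped = euler(lambda x: 0.05 * x * (1 - x / 100), x0=1.0)  # shifting dominance: S-shaped

print(f"growth ends at {growth[-1]:.1f}, goal-seeking at {seeking[-1]:.1f}, "
      f"S-shaped at {s_shaped[-1]:.1f}")
```

Plotting the three series against time reproduces the characteristic BOT shapes: unbounded growth, asymptotic approach to a goal, and growth that saturates as the negative loop takes over from the positive one.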

We can deduce the structure of a system based on its BOT

And in real life, these BOT patterns exist
• Exponential Growth

Figure 4-3
Sources: Bureau of Economic Analysis; Kurian (1994); US Census Bureau; Joglekar (1996).
And in real life, these BOT patterns exist
• Goal-seeking behavior: examples

Figure 4-5
Source: 1966-1971: Defects: Sterman, Repenning, and Kofman (1997); Load Factor: Annual report of Teollisuuden Voima Oy (TVO), Finland, 1994; Advertising: Kurian (1994); Fatalities: Historical Statistics of the US, Statistical Abstract of the US.
More Examples
Again with real-life counterparts
• Oscillations

Figure 4-7 Oscillation: examples


Source: Historical Statistics of the United States, US Bureau of Economic Analysis.
Again with real-life counterparts
• S-Shaped Growth

US Cable Television Subscribers, 1950-2000: % of households with TV subscribing to cable, and cable subscribers (million households).

Figure 4-9
Source: Sunflowers: Lotka (1956, p. 74); Cable TV: Kurian (1994), Statistical Abstract of the US; Pacemaker adoption: Homer (1983, 1987).
More Examples (2)
Real-life counterpart data
• Overshoot and Oscillate

Figure 4-11
Source: London Population: 1800-1960, Mitchell (1975); Aluminum Production: USGS;
http://minerals.er.usgs.gov/minerals/pubs/commodity/

Real-life counterparts
• Overshoot and Collapse

Figure 4-13
Source: Haddock: 1887-1950, Historical Statistics of the United States; National Marine Fisheries Service. Nuclear Capacity:
Brown, Flavin, and Kane (1992). Atari: Paich and Sterman (1993). Silver Prices: Cash Price, Datastream database.
Estimated population and tree cover of Easter Island

Figure 4-14. Note: the time axes for the top and bottom graphs differ.
Source: Bahn and Flenley (1992, p. 174).
Other Modes of Behavior

• Stasis, or equilibrium
  • Change is too slow relative to the time horizon, or
  • Powerful negative feedback maintains order
• Randomness
  • Comes from our limited knowledge of the system (we don't know what is causing the variations)
• Chaos
  • Oscillations with no single repeating pattern
