PW07 Model Experimentation Analysis
Ruang L301-302
Gedung Laboratorium
Departemen Teknik Industri
Fakultas Teknik
Universitas Indonesia
systems.ie.ui.ac.id
Agenda
• Problem Formulation
• Goal Setting
• Model Conceptualization
• Data Collection
• Model Translation
• Verification and Validation
• Experimental Design
• Production Runs and Analysis
• Documentation/Reporting
• Implementation
Modeling Processes (Sterman 2000)
• Results of any step can yield insights that lead to revisions in any earlier step (indicated by the links in the center of the diagram). The modeling process is iterative.
1. Problem Articulation (Boundary Selection)
2. Dynamic Hypothesis
3. Formulation
4. Testing
5. Policy Formulation & Evaluation
Orient your analysis to the two types of model usage
“Exploratory Modeling and the Use of Simulation for Policy Analysis” (Bankes 1993)
Comparison of Exploratory and Explanatory Approaches
• Output (Exploratory): rather a set of inter-related, not necessarily tightly-coupled, “semi-finished” simpler exploratory models
• Output (Explanatory): a model that is able to relate to reality, although it cannot be claimed as absolute reality
Purposes of Simulation Revisited
• To explore alternatives
• To improve the quality of decision making
• To enable more effective planning
• To improve understanding of the business
• To enable faster decision making
• To provide more timely information
• To enable more accurate forecasts
• To generate cost savings
Repetition vs Replication
• Be careful: these terms have different meanings in different contexts
• Random input leads to random output (RIRO)
• Run a simulation (once): what does it mean?
  • Was this run “typical” or not?
  • Variability from run to run (of the same model)?
• Need statistical analysis of output data
• Time frame of simulations
  • Terminating: specific starting and stopping conditions
  • Steady-state: long-run (technically forever)
  • Here: terminating
• How many runs should I perform? NEVER just once or twice
  • Computing power
  • Confidence in the results (variability of the model)
  • Trial and error
• Creating scenarios based on the problem questions (how many?): usually 3
  • Pessimistic
  • Optimal
  • Ideal
• Be familiar with the reporting capabilities of each simulation application
  • Statistical module (output and graphs)
  • Ability to export to different file types (txt, xls, etc.) for further statistical analysis
  • Output Analyzer
• Always return to problem articulation and the Dynamic Hypothesis for experimentation
• Be careful with information overload; focus on answering the problem questions
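The “how many runs” question above is usually answered by looking at a confidence-interval half-width across independent replications. A minimal sketch, assuming a toy single-server queue as the terminating model; the model, its parameters, and the 50-customer stopping rule are all illustrative, not from the lecture:

```python
import math
import random
import statistics

def one_replication(seed):
    """One terminating run of a toy stochastic model (hypothetical:
    average waiting time of 50 customers at a single server)."""
    rng = random.Random(seed)
    t_arrive = 0.0
    t_free = 0.0   # time the server next becomes free
    total_wait = 0.0
    for _ in range(50):
        t_arrive += rng.expovariate(1.0)        # mean interarrival 1.0
        start = max(t_arrive, t_free)           # wait if server busy
        total_wait += start - t_arrive
        t_free = start + rng.expovariate(1.25)  # mean service 0.8
    return total_wait / 50

def ci_halfwidth(values, t_crit=2.776):
    """95% CI half-width; t_crit here is for 4 degrees of freedom (n=5)."""
    s = statistics.stdev(values)
    return t_crit * s / math.sqrt(len(values))

reps = [one_replication(seed) for seed in range(5)]
print(f"mean wait = {statistics.mean(reps):.3f} +/- {ci_halfwidth(reps):.3f}")
```

If the half-width is too wide for the decision at hand, add replications (and the matching t critical value) until it shrinks to an acceptable level.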
Concepts of Leverage
• At which point in space could we place a rod with a rabbit at one end to move the Earth at the other end?
Leverage Points
Sensitivity Analysis
• Numerical Sensitivity
  • A change in assumptions changes the numerical values of the results
• Behavior Mode Sensitivity
  • A change in assumptions changes the patterns of behavior generated by the model
• Policy Sensitivity
  • A change in assumptions reverses the impacts or desirability of a proposed policy
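Each kind of sensitivity can be probed by re-running the model while varying one assumption. A minimal numerical-sensitivity sketch, assuming a hypothetical logistic stock model; the +/-20% variation band and all parameter values are illustrative:

```python
# Vary one assumption (growth fraction g) and watch the final stock
# change. The model and values are invented for illustration.

def simulate_stock(g, n_steps=20, stock=100.0, capacity=1000.0, dt=1.0):
    """Logistic (S-shaped) growth: net inflow = g * stock * (1 - stock/capacity)."""
    for _ in range(n_steps):
        stock += g * stock * (1 - stock / capacity) * dt
    return stock

base = simulate_stock(g=0.3)
for g in (0.24, 0.3, 0.36):   # +/- 20% around the base assumption
    final = simulate_stock(g)
    print(f"g={g:.2f}  final stock={final:7.1f}  change vs base={100*(final/base - 1):+.1f}%")
```

Behavior-mode sensitivity shows up in the same sketch if the sign of g is flipped: growth turns into decay, a different mode rather than a different number.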
What is a Scenario?
• A sequence of alternative images of how the future may unfold, created from mental or formal models, which reflects the integration of past, present, and future developments (Rotmans et al. 2000)
• Scenarios are not about predicting the future; rather, they are about perceiving futures in the present
• They are not prophecies, and they are not predictions based on “gut feelings”

When humans ask an ultra-intelligent computer, “Do you compute that you will ever think like a human being?”, after long bleeps and blinks the computer answers, “That reminds me of a story.”
Steps of Scenario Development
[Diagram: a knowledge matrix crossing knowing what we know, knowing what we don't know, not knowing what we know, and not knowing what we don't know, against outcomes (better/worse) and direction (same/change).]
Things to Consider in Developing Scenarios
• The Purpose of Modeling: what it will be used for, based on user needs
• Defining the Key Variables: from outputs to inputs, and the variable variations
• The Capabilities of the Software that you Use
• The Capabilities of the Hardware
• Resource Limitations: Time, Expertise, Money, etc.
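One common way to keep the “usually 3” scenarios organized is a parameter table driving a single model. A sketch in which the scenario names follow the rule of thumb above, while the parameter values and the utilization output measure are invented for illustration:

```python
# Scenario table: each named scenario is just a set of input parameters
# fed to the same model. All numbers here are hypothetical.

SCENARIOS = {
    "pessimistic": {"demand_rate": 8.0,  "service_rate": 9.0},
    "optimal":     {"demand_rate": 10.0, "service_rate": 12.0},
    "ideal":       {"demand_rate": 12.0, "service_rate": 16.0},
}

def utilization(demand_rate, service_rate):
    """Toy output measure: server utilization rho = lambda / mu."""
    return demand_rate / service_rate

for name, params in SCENARIOS.items():
    print(f"{name:12s} rho = {utilization(**params):.2f}")
```

Keeping the scenario definitions in one table makes it easy to add or revise a scenario without touching the model itself.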
Most models start empty and idle
• Empty: no entities present at time 0
• Idle: all resources idle at time 0
• In a terminating simulation this is OK if realistic
• In a steady-state simulation, though, this can bias the output for a while after startup
  • Bias can go either way; usually downward (results are biased low) in queueing-type models that eventually get congested
  • Depending on the model, parameters, and run length, the bias can be very severe

Remedies for initialization bias
• Better starting state, more typical of steady state
  • Throw some entities around the model
  • Can be inconvenient to do this in the model
  • How do you know how many to throw, and where? (This is what you're trying to estimate in the first place.)
• Make the run so long that the bias is overwhelmed
  • Might work if the initial bias is weak or dissipates quickly
• Let the model warm up, still starting empty and idle
  • Simulate module: Warm-Up Period (time units!)
  • “Clears” all statistics at that point for the summary report and any cross-replication data saved with the Statistics module's Outputs area (but not Time-Persistent or Tallies)
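The bias of an empty-and-idle start, and the warm-up deletion remedy, can be seen in a hand-rolled M/M/1 queue. A sketch in which the arrival and service rates, the run length, and the 2,000-observation deletion point are all illustrative choices:

```python
import random

def mm1_waits(n_customers, lam=0.9, mu=1.0, seed=1):
    """Waiting times in an M/M/1 queue that starts empty and idle."""
    rng = random.Random(seed)
    t_arrive = 0.0
    t_free = 0.0   # time the server next becomes free
    waits = []
    for _ in range(n_customers):
        t_arrive += rng.expovariate(lam)
        start = max(t_arrive, t_free)
        waits.append(start - t_arrive)
        t_free = start + rng.expovariate(mu)
    return waits

waits = mm1_waits(20_000)
warmup = 2_000  # deletion point chosen by eye; Welch's procedure would formalize this
full = sum(waits) / len(waits)
trunc = sum(waits[warmup:]) / len(waits[warmup:])
print(f"all observations: {full:.2f}   after warm-up deletion: {trunc:.2f}")
```

With utilization 0.9 the early, uncongested observations drag the overall average down, which is exactly the downward bias the slide describes for queueing-type models.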
Warm Up and Run Length (cont’d.)
• No explosions
• All seem to be settling
into steady state
• Run length seems
adequate to reach
steady state
• Hard to judge warm-up
...
Warm Up and Run Length (cont’d.)
• If you can identify appropriate warm-up and run-length times, just make
replications as for terminating simulations
• Only difference: Specify Warm-Up Period in Simulate module
• Proceed with confidence intervals, comparisons, all statistical analysis as in
terminating case
Determining Important Experimentation Factors
• Data analysis
• Expert knowledge
• Preliminary experimentation
When factors are changed in isolation, they may have a very different effect from when they are changed in combination (ceteris paribus does not apply).
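The warning about changing factors in isolation is the classic interaction effect from factorial design. A tiny 2^2 sketch with an invented response function:

```python
# Why "ceteris paribus" fails when factors interact: the response
# function below is made up purely for illustration.

def response(a, b):
    """Hypothetical output: main effects plus an interaction term a*b."""
    return 10 + 2 * a + 3 * b + 4 * a * b

low, high = 0, 1
effect_a_alone = response(high, low) - response(low, low)    # change A only
effect_b_alone = response(low, high) - response(low, low)    # change B only
effect_both    = response(high, high) - response(low, low)   # change together
print(effect_a_alone, effect_b_alone, effect_both)  # 2 3 9, not 2 + 3 = 5
```

Changing A alone adds 2 and B alone adds 3, but changing both adds 9 because of the interaction term, so one-factor-at-a-time experimentation would miss it entirely.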
Common modes of behavior in dynamic systems
Figure 4-1
We can deduce the structure based on their behavior over time (BOT)
And in real life, these BOT patterns exist
• Exponential Growth
Figure 4-3
Sources: Bureau of Economic Analysis; Kurian (1994); US Census Bureau; Joglekar (1996).
And in real life, these BOT patterns exist
• Goal-seeking behavior: examples
Figure 4-5
Source: Defects (1966-1971): Sterman, Repenning, and Kofman (1997); Load Factor: annual report of Teollisuuden Voima Oy (TVO), Finland, 1994; Advertising: Kurian (1994); Fatalities: Historical Statistics of the US, Statistical Abstract of the US.
Other Examples
Again with real-life counterparts
• Oscillations
[Figure: % of households with TV subscribing to cable, and cable subscribers (million households), 1950-2000]
Figure 4-9
Source: Sunflowers: Lotka (1956, p. 74); Cable TV: Kurian (1994), Statistical Abstract of the US; Pacemaker adoption: Homer (1983, 1987).
Other Examples (2)
Real-life counterpart data
• Overshoot and Oscillate
Figure 4-11
Source: London Population: 1800-1960, Mitchell (1975); Aluminum Production: USGS, http://minerals.er.usgs.gov/minerals/pubs/commodity/
Real-life counterparts
• Overshoot and Collapse
Figure 4-13
Source: Haddock: 1887-1950, Historical Statistics of the United States; National Marine Fisheries Service. Nuclear Capacity: Brown, Flavin, and Kane (1992). Atari: Paich and Sterman (1993). Silver Prices: Cash Price, Datastream database.
Estimated population and tree cover of Easter Island
Figure 4-14. Note: Time axes for top and bottom graphs differ.
Other Modes of Behavior
• Stasis or Equilibrium
  • Change too slow relative to the time horizon
  • Powerful negative feedback that maintains order
• Randomness
  • Comes from our limited knowledge of the system (we don't know what is causing the variations)
• Chaos
  • Oscillations without a single repeating pattern
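The link between feedback structure and behavior over time can be sketched with three minimal difference equations (all parameter values are illustrative): pure reinforcing feedback gives exponential growth, pure balancing feedback gives goal-seeking, and balancing feedback acting through a delay gives oscillation.

```python
def exponential_growth(x0=1.0, g=0.1, n=50):
    """Pure positive (reinforcing) feedback: grow by a fixed fraction."""
    xs = [x0]
    for _ in range(n):
        xs.append(xs[-1] * (1 + g))
    return xs

def goal_seeking(x0=0.0, goal=100.0, k=0.2, n=50):
    """Pure negative (balancing) feedback: close a fraction of the gap."""
    xs = [x0]
    for _ in range(n):
        xs.append(xs[-1] + k * (goal - xs[-1]))
    return xs

def oscillation(x0=1.0, v0=0.0, k=0.3, n=50):
    """Negative feedback with a delay (second-order): restoring force on x."""
    xs, v = [x0], v0
    for _ in range(n):
        v -= k * xs[-1]
        xs.append(xs[-1] + v)
    return xs

print(exponential_growth()[-1] > exponential_growth()[0])  # grows
print(abs(goal_seeking()[-1] - 100.0) < 1.0)               # approaches the goal
print(min(oscillation()) < 0 < max(oscillation()))         # swings across zero
```

This is the sense in which structure can be deduced from a BOT graph: each qualitatively different pattern points back to a different feedback configuration.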