
Chapter 10

Verification and Validation


of Simulation Models

Banks, Carson, Nelson & Nicol


Discrete-Event System Simulation
Purpose & Overview
 The goal of the validation process is:
 To produce a model that represents true behavior closely
enough for decision-making purposes
 To increase the model’s credibility to an acceptable level
 Validation is an integral part of model development
 Verification – building the model correctly (correctly implemented
with good input and structure)
 Validation – building the correct model (an accurate
representation of the real system)
 Most methods are informal subjective comparisons while
a few are formal statistical procedures

2
Model-Building, Verification & Validation

3
Verification
 Purpose: ensure the conceptual model is reflected
accurately in the computerized representation.
 Many common-sense suggestions, for example:
 Have someone else check the model.
 Make a flow diagram that includes each logically possible action
a system can take when an event occurs.
 Closely examine the model output for reasonableness under a
variety of input parameter settings. (Often overlooked!)
 Print the input parameters at the end of the simulation and make
sure they have not been changed inadvertently.

4
Examination of Model Output
for Reasonableness [Verification]

 Example: A model of a complex network of queues
consisting of many service centers.
 Response time is the primary interest; however, it is important to
collect and print out many statistics in addition to response
time.
 Two statistics that give a quick indication of model reasonableness are
current contents and total counts, for example:
 If the current contents grow in a more or less linear fashion as the
simulation run time increases, the queue is likely unstable
 If the total count for some subsystem is zero, then no items entered
that subsystem, a highly suspect occurrence
 If the total and current counts are both equal to one, an entity may
have captured a resource but never freed that resource.
 Compute certain long-run measures of performance, e.g. compute the
long-run server utilization and compare to simulation results

5
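The reasonableness checks above can be automated. Below is a minimal sketch (the trace data and threshold are illustrative, not from any particular model) that flags a queue whose current contents grow roughly linearly over the run, the instability symptom described above:

```python
import numpy as np

def queue_appears_unstable(times, contents, slope_tol=0.05):
    """Flag a queue whose current contents grow roughly linearly with
    simulation time, a symptom of instability (arrival rate >= service rate).
    `times` and `contents` are a hypothetical trace from the model."""
    slope, _intercept = np.polyfit(times, contents, deg=1)
    return slope > slope_tol

# A stable queue fluctuates around a constant level; an unstable one drifts up.
rng = np.random.default_rng(1)
t = np.linspace(0, 120, 500)                    # 2 hours, in minutes
stable = 3 + rng.normal(0, 1, t.size)           # hovers near 3 customers
unstable = 0.5 * t + rng.normal(0, 1, t.size)   # grows ~0.5 customers/min

print(queue_appears_unstable(t, stable))    # expect False
print(queue_appears_unstable(t, unstable))  # expect True
```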
Other Important Tools [Verification]

 Documentation
 A means of clarifying the logic of a model and verifying
its completeness
 Use of a trace
 A detailed printout of the state of the simulation model
over time.

6
Calibration and Validation
 Validation: the overall process of comparing the model and its
behavior to the real system.
 Calibration: the iterative process of comparing the model to the real
system and making adjustments.

7
Calibration and Validation
 No model is ever a perfect representation of the system
 The modeler must weigh the possible, but not guaranteed,
increase in model accuracy versus the cost of increased validation
effort.
 Three-step approach:
 Build a model that has high face validity.
 Validate model assumptions.
 Compare the model input-output transformations with the real
system’s data.

8
High Face Validity [Calibration & Validation]

 Ensure a high degree of realism: Potential users should be
involved in model construction (from its conceptualization to its
implementation).
 Sensitivity analysis can also be used to check a model’s
face validity.
 Example: In most queueing systems, if the arrival rate of
customers were to increase, it would be expected that server
utilization, queue length and delays would tend to increase.

9
Validate Model Assumptions
[Calibration & Validation]

 General classes of model assumptions:


 Structural assumptions: how the system operates.
 Data assumptions: reliability of data and its statistical analysis.
 Bank example: customer queueing and service facility in a
bank.
 Structural assumptions, e.g., customer waiting in one line versus
many lines, served FCFS versus priority.
 Data assumptions, e.g., interarrival time of customers, service
times for commercial accounts.
 Verify data reliability with bank managers.
 Test correlation and goodness of fit for data (see Chapter 9 for more
details).

10
Validate Input-Output Transformation
[Calibration & Validation]

 Goal: Validate the model’s ability to predict future behavior


 The only objective test of the model.
 The structure of the model should be accurate enough to make
good predictions for the range of input data sets of interest.
 One possible approach: use historical data that have been
reserved for validation purposes only.
 Criteria: use the main responses of interest.

12
Bank Example [Validate I-O Transformation]

 Example: One drive-in window serviced by one teller; only
one or two transactions per customer are allowed.
 Data collection: 90 customers during 11 am to 1 pm.
 Observed service times {Si, i = 1,2, …, 90}.
 Observed interarrival times {Ai, i = 1,2, …, 90}.
 Data analysis led to the conclusion that:
 Interarrival times: exponentially distributed with rate λ = 45 per hour
 Service times: N(1.1, 0.2²) minutes

13
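The fitted input models above can be sampled directly. A sketch (the seed and the clipping of negative service times are illustrative choices, not from the text):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 90  # customers observed from 11 am to 1 pm

# Interarrival times: exponential with rate 45 per hour -> mean 60/45 minutes
interarrivals = rng.exponential(scale=60 / 45, size=n)

# Service times: N(1.1, 0.2^2) minutes; clip at a small positive value, since
# a normal model can (rarely) produce negative times
services = np.clip(rng.normal(loc=1.1, scale=0.2, size=n), 0.01, None)

print(interarrivals.mean())  # should be near 60/45 = 1.33 minutes
print(services.mean())       # should be near 1.1 minutes
```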
The Black Box
[Bank Example: Validate I-O Transformation]

 A model was developed in close consultation with bank


management and employees
 Model assumptions were validated
 Resulting model is now viewed as a “black box”:

Input variables:
 Uncontrolled variables, X:
 Poisson arrivals, λ = 45/hr: X11, X12, …
 Service times, N(D2, 0.2²): X21, X22, …
 Controlled decision variables, D:
 D1 = 1 (one teller)
 D2 = 1.1 min (mean service time)
 D3 = 1 (one line)

Model (“black box”): f(X, D) = Y

Model output variables, Y:
 Primary interest:
 Y1 = teller’s utilization
 Y2 = average delay
 Y3 = maximum line length
 Secondary interest:
 Y4 = observed arrival rate
 Y5 = average service time
 Y6 = sample std. dev. of service times
 Y7 = average length of time

14
Comparison with Real System Data
[Bank Example: Validate I-O Transformation]

 Real system data are necessary for validation.


 System responses should have been collected during the same
time period (from 11 am to 1 pm on the same Friday).
 Compare the average delay from the model Y2 with the
actual delay Z2:
 Average delay observed, Z2 = 4.3 minutes; consider this to be the
true mean value μ0 = 4.3.
 When the model is run with generated random variates X1n and
X2n, Y2 should be close to Z2.
 Six statistically independent replications of the model, each of 2-
hour duration, are run.

15
Hypothesis Testing
[Bank Example: Validate I-O Transformation]

 Compare the average delay from the model Y2 with the


actual delay Z2 (continued):
 Null hypothesis testing: evaluate whether the simulation and the
real system are the same (w.r.t. output measures):

H0: E(Y2) = 4.3 minutes
H1: E(Y2) ≠ 4.3 minutes

 If H0 is not rejected, then, there is no reason to consider the


model invalid
 If H0 is rejected, the current version of the model is rejected,
and the modeler needs to improve the model

16
Hypothesis Testing
[Bank Example: Validate I-O Transformation]
 Conduct the t test:
 Choose the level of significance (α = 0.05) and sample size (n = 6);
see the results in Table 10.2.
 Compute the sample mean and sample standard deviation over
the n replications:

Ȳ2 = (1/n) Σᵢ Y2i = 2.51 minutes

S = √[ Σᵢ (Y2i − Ȳ2)² / (n − 1) ] = 0.82 minutes

 Compute the test statistic:

t0 = (Ȳ2 − μ0) / (S/√n) = (2.51 − 4.3) / (0.82/√6) = −5.35

|t0| = 5.35 > t_critical = 2.571 (for a 2-sided test)

 Hence, reject H0. Conclude that the model is inadequate.


 Check: the assumptions justifying a t test, that the observations
(Y2i) are normally and independently distributed.
17
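The t test above can be reproduced from the slide’s summary statistics alone (a sketch using SciPy; the individual replication values come from Table 10.2 and are not shown here):

```python
from math import sqrt
from scipy import stats

n, ybar, s = 6, 2.51, 0.82   # replications, sample mean, sample std (slide values)
mu0, alpha = 4.3, 0.05       # observed system delay, level of significance

# One-sample t statistic for H0: E(Y2) = mu0
t0 = (ybar - mu0) / (s / sqrt(n))
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)  # two-sided critical value

print(round(t0, 2))      # about -5.35
print(round(t_crit, 3))  # about 2.571
print(abs(t0) > t_crit)  # True -> reject H0; the model is inadequate
```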
Hypothesis Testing
[Bank Example: Validate I-O Transformation]

 Similarly, compare the model output with the observed


output for other measures:
Y4 with Z4, Y5 with Z5, and Y6 with Z6

18
Type II Error [Validate I-O Transformation]

 For validation, the power of the test is:
 Probability[ detecting an invalid model ] = 1 − β
 β = P(Type II error) = P(failing to reject H0 | H1 is true)
 Considering failure to reject H0 as a strong conclusion, the modeler
would want β to be small.
 The value of β depends on:
 Sample size, n
 The true difference δ between E(Y) and μ0: δ = |E(Y) − μ0| / σ

 In general, the best approach to control b error is:


 Specify the critical difference, d.
 Choose a sample size, n, by making use of the operating
characteristics curve (OC curve).
19
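The OC-curve lookup can also be approximated numerically: for the two-sided one-sample t test, the power at standardized difference δ follows from the noncentral t distribution. A sketch (the function name is illustrative):

```python
from math import sqrt
from scipy import stats

def power_two_sided_t(delta, n, alpha=0.05):
    """Approximate power of the two-sided one-sample t test when the true
    standardized difference is delta = |E(Y) - mu0| / sigma, via the
    noncentral t distribution."""
    df = n - 1
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    nc = delta * sqrt(n)  # noncentrality parameter
    beta = stats.nct.cdf(t_crit, df, nc) - stats.nct.cdf(-t_crit, df, nc)
    return 1 - beta

# For a fixed critical difference, power grows with the sample size,
# which is how the OC curve is used to choose n.
p6 = power_two_sided_t(delta=1.0, n=6)
p20 = power_two_sided_t(delta=1.0, n=20)
print(p6, p20)
```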
Type I and II Error
[Validate I-O Transformation]

 Type I error (α):
 Error of rejecting a valid model.
 Controlled by specifying a small level of significance α.
 Type II error (β):
 Error of accepting a model as valid when it is invalid.
 Controlled by specifying the critical difference and finding the
required sample size n.
 For a fixed sample size n, increasing α will decrease β.

20
Confidence Interval Testing
[Validate I-O Transformation]

 Confidence interval testing: evaluate whether the


simulation and the real system are close enough.
 If Y is the simulation output and μ = E(Y), the confidence
interval (C.I.) for μ is:
Ȳ ± t_{α/2, n−1} S/√n
 Validating the model:
 Suppose the C.I. does not contain μ0:
 If the best-case error is > ε, the model needs to be refined.
 If the worst-case error is ≤ ε, accept the model.
 Otherwise (best-case error ≤ ε but worst-case error > ε),
additional replications are necessary.
 Suppose the C.I. contains μ0:
 If either the best-case or worst-case error is > ε, additional
replications are necessary.
 If the worst-case error is ≤ ε, accept the model.

21
Confidence Interval Testing
[Validate I-O Transformation]

 Bank example: μ0 = 4.3 minutes, and “close enough” is ε = 1
minute of expected customer delay.
 A 95% confidence interval, based on the 6 replications, is
[1.65, 3.37] because:

Ȳ ± t_{0.025,5} S/√n
2.51 ± 2.571 (0.82/√6)

 μ0 = 4.3 falls outside the confidence interval; the best-case error
|3.37 − 4.3| = 0.93 < 1, but the worst-case error |1.65 − 4.3| = 2.65 > 1,
so additional replications are needed to reach a decision.

22
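The interval and the best-case/worst-case errors can be recomputed from the example’s summary values (a sketch; ε and the statistics are taken from the slides above):

```python
from math import sqrt
from scipy import stats

n, ybar, s = 6, 2.51, 0.82  # replications, sample mean, sample std
mu0, eps = 4.3, 1.0         # true mean delay and "close enough" tolerance (min)

# 95% confidence interval for the model's expected delay
half_width = stats.t.ppf(0.975, n - 1) * s / sqrt(n)
lo, hi = ybar - half_width, ybar + half_width
print(round(lo, 2), round(hi, 2))  # about 1.65 3.37

best_case = min(abs(lo - mu0), abs(hi - mu0))
worst_case = max(abs(lo - mu0), abs(hi - mu0))
print(best_case < eps, worst_case > eps)  # True True -> more replications needed
```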
Using Historical Output Data
[Validate I-O Transformation]

 An alternative to generating input data:


 Use the actual historical record.
 Drive the simulation model with the historical record and then
compare model output to system data.
 In the bank example, use the recorded interarrival and service
times for the customers {An, Sn, n = 1,2,…}.
 Procedure and validation process: similar to the
approach used for system generated input data.

23
Using a Turing Test
[Validate I-O Transformation]

 Used in addition to statistical tests, or when no statistical
test is readily applicable.
 Utilizes people’s knowledge about the system.
 For example:
 Present 10 system performance reports to a manager of the
system. Five of them are from the real system and the rest are
“fake” reports based on simulation output data.
 If the person identifies a substantial number of the fake reports,
interview the person to get information for model improvement.
 If the person cannot distinguish between fake and real reports
with consistency, conclude that the test gives no evidence of
model inadequacy.

24
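One hedged way to quantify the Turing-test outcome: if the manager labels 5 of the 10 reports as fake, the number of fakes correctly identified under pure guessing follows a hypergeometric distribution. The counts below are the example’s; the significance computation itself is an illustrative addition, not from the text:

```python
from scipy import stats

# 10 reports: 5 real, 5 fake; the manager must label 5 as "fake".
# Under pure guessing, the number of fakes correctly labeled is
# hypergeometric with population M=10, n=5 fakes, N=5 draws.
guessing = stats.hypergeom(M=10, n=5, N=5)

k_identified = 5  # hypothetical outcome: the manager spots all five fakes
p_value = guessing.sf(k_identified - 1)  # P(k >= 5) under guessing
print(p_value)  # 1/252, about 0.004 -> evidence the fake reports stand out
```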
Summary
 Model validation is essential:
 Model verification
 Calibration and validation
 Conceptual validation
 Best to compare system data to model data, and make
comparison using a wide variety of techniques.
 Some techniques that we covered (in increasing cost-to-
value ratios):
 Ensure high face validity by consulting knowledgeable persons.
 Conduct simple statistical tests on assumed distributional forms.
 Conduct a Turing test.
 Compare model output to system output by statistical tests.

25
