Lecture 25

The document discusses discrete event simulation for manufacturing systems. It covers topics like the meaning and functions of simulation, classification of simulation models, basic definitions used in simulation, and provides an example of simulating an M/M/1 queue using an event-scheduling approach.

SMA6304 M2 - Factory Planning and Scheduling

Lecture: Discrete Event Simulation of Manufacturing Systems

Sivakumar AI
Lecture: 12

copyright © 2002 Sivakumar 1

Simulation

Simulation - A Predictive Tool

[Diagram: data from the manufacturing system feeds the SIMULATOR, which works with a scheduling engine to predict the next job, the next task, schedules, and plans.]

Simulation - A Decision Support Tool

Simulation
Lecture (12 December 2002)

• 1. Introduction
• 2. Input probability distributions
• 3. Generating Random Numbers & Random Variates
• 4. Verification and Validation
• 5. Output data analysis
• 6. Simulation life cycle

References
• Law, A. M. and Kelton, W. D., Simulation Modeling and Analysis, McGraw-Hill
• Banks, J., Handbook of Simulation, EMP


1. Introduction

The Meaning of Simulation (Oxford English Dictionary)

• Simulation:
– The technique of imitating the behavior of some situation or system (manufacturing, etc.) by means of an analogous situation, model, or apparatus, either to gain information more conveniently or.....
• Simulator:
– An apparatus or system for reproducing the behavior of some situation or system; ....., and gives the illusion .... of behaving like the real thing.


System Modeling
Model:
• A simplified or idealized description of a system, situation, or process, often in mathematical terms, devised to facilitate calculations and predictions.
• A representation of an object, system, or idea in a form other than that of the entity/system itself.
• An abstraction and simplification of the real world.

System Modeling - Functions of models

• As an analytical tool
– Analyze manufacturing systems
– Evaluate equipment requirements
– Design transportation facilities
– Set ordering policy for an inventory system
• As an aid for experimentation
• For planning and scheduling
• As an aid to thought
• As an aid to communicating
• For education and training

Classification of models
• Physical models
– Analog models of continuous systems, e.g. traffic flow.
– Iconic models, e.g. pilot-training simulators.
• Analytical/mathematical models (most scheduling systems)
– Representing a system in terms of quantitative relationships.
• Static simulation models
– Time does not play a role, e.g. Monte Carlo simulation.
• Conventional simulation models
– Model the system as it evolves over time; therefore dynamic.
– Have I/O and internal structure; run rather than solved.
– Empirical.
– Stochastic (can be deterministic for scheduling applications).
• Online simulation models
– As conventional, but near-real-time; useful for decision support.

Classification of simulation models
• Deterministic vs. stochastic simulation models
– If there are no probabilistic components, the model is deterministic.
– If random input components are used, it is stochastic.
• Continuous vs. discrete-event simulation models
– Discrete-event simulation models a system as it evolves over time by a representation in which state variables change instantaneously at events.
– Continuous simulation models a system over time by a representation in which state variables change continuously with respect to time (e.g. using differential equations).
• Combined discrete-continuous simulation
– For systems that are neither completely discrete nor completely continuous, e.g. the arrival of a tanker and filling it.


Simulation models

Conventional Simulation Model (manual model generation):
• For analytical work
• For decisions or confirmation of decisions
• Rapid model building
• Manually intensive
• Hard to maintain

Dynamic (near-real-time) Simulation Model (auto model generation):
• Most suitable for planning and scheduling
• Uses near-real-time data
• Fully automatic
• Integrated with info. systems
• No direct maintenance

Some basic definitions
• System state variables
– The collection of information needed to define what is happening in the system, to a sufficient level, at a given point in time.
• Events
– Exogenous, e.g. an order arrival; endogenous, e.g. a machine breakdown.
• Entities and attributes
– Dynamic entity, e.g. a customer; static entity, e.g. a machine.
– An entity is defined by its attributes, e.g. the quantity of a lot.
• Resources
– A resource is a static entity that provides service to a dynamic entity (e.g. a lot).
• Activity and delay
– An activity is a period whose duration is known; a delay is an indefinite duration caused by a combination of system conditions.


Four types of modeling structures


• Event-scheduling method
– Events are scheduled by advancing the simulation clock exactly to the time of the next event. This is one of the most accurate structures.
• Activity-scanning method
• Three-phase method
• Process-interaction method

Example: M/M/1 queue

[Diagram: an arriving job (customer, with IID random arrivals) joins the jobs (customers) in queue; the server (machine) processes the job (customer) in service; a completed job departs (a departing customer).]


Next-event time advance mechanism in the Event-Scheduling method of an M/M/1 queue

t_i = time of arrival of the i-th customer
A_i = t_i - t_(i-1) = interarrival time (IID random variables)
S_i = service time of the i-th customer (IID random variable)
D_i = observed delay of the i-th customer in queue
c_i = t_i + D_i + S_i = completion time of the i-th customer
e_i = time of occurrence of the i-th event (of any type)
B(t) = "busy function", defined as 1 if the server is busy at time t and 0 if the server is idle at time t
Next-event time advance mechanism in Event-Scheduling
method of an M/M/1 queue

• Interarrival times A_i and service times S_i have cumulative distribution functions F_A and F_S, which are determined by collecting actual past data and fitting distributions. (See 2. Input probability distributions.)

• Each value of t_i is computed using generated values of A_i, i.e. random observations drawn from a specific distribution. (See 3. Generating Random Numbers and Random Variates.)


Next-event time advance mechanism in the Event-Scheduling method of an M/M/1 queue

[Timeline: the real-time clock runs continuously while the simulation clock jumps between event times e_0, e_1, ..., e_5, occurring at 0, t_1, t_2, c_1, t_3, c_2; interarrival times A_1, A_2, A_3 and service times S_1, S_2 span the intervals between these events.]
Next-event time advance mechanism in Event-Scheduling
method of an M/M/1 queue

• The simulation clock is advanced from each event to the next based on the event times in the event list.

• A machine starting to process a job is an activity; its end time is known (from S_i, drawn from F_S).

• When a customer (or a job) arrives and the server is busy, it joins the queue; its end time is unknown. This is a delay.

Discrete event simulation of an M/M/1 queue

Job  Arrival t_i  Service S_i  Start  End c_i  Delay D_i
J1   0.4          2.0          0.4    2.4      0
J2   1.6          0.7          2.4    3.1      0.8
J3   2.1          0.2          3.1    3.3      1.0
J4   3.8          1.1          3.8    4.9      0
J5   4.0          3.7          4.9    8.6      0.9
J6   5.6          2.8
J7   5.8          1.6
J8   7.2          3.1
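The event-scheduling logic behind this table can be sketched in Python (an illustrative sketch, not code from the lecture; the function name is mine): a future-event list is kept as a heap, the clock jumps to the time of the next event, arrivals either start service or join the queue, and departures pull the next waiting job. Fed the arrival and service times above, it reproduces the Start, End, and Delay columns.

```python
import heapq

def simulate_mm1(arrivals, services):
    """Event-scheduling simulation of a single-server FIFO queue.

    arrivals: job arrival times t_i (sorted); services: service times S_i.
    Returns per-job (service start, completion c_i, delay in queue D_i).
    """
    n = len(arrivals)
    # Future-event list, ordered by event time.
    fel = [(t, 'arrival', i) for i, t in enumerate(arrivals)]
    heapq.heapify(fel)
    queue, busy = [], False
    start = [0.0] * n
    while fel:
        clock, kind, i = heapq.heappop(fel)   # next-event time advance
        if kind == 'arrival':
            if busy:
                queue.append(i)               # server busy: a delay begins
            else:
                busy, start[i] = True, clock  # service is an activity: end known
                heapq.heappush(fel, (clock + services[i], 'departure', i))
        else:                                 # departure frees the server
            busy = False
            if queue:
                j = queue.pop(0)
                busy, start[j] = True, clock
                heapq.heappush(fel, (clock + services[j], 'departure', j))
    completion = [s + sv for s, sv in zip(start, services)]
    delay = [s - t for s, t in zip(start, arrivals)]
    return start, completion, delay
```

Running it on the first five jobs of the table yields delays 0, 0.8, 1.0, 0, 0.9 and completions 2.4, 3.1, 3.3, 4.9, 8.6, matching the table.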

Discrete event simulation of an M/M/1 queue
[Figure: sample paths of B(t) (0 or 1) and Q(t) (0 to 3) over time. Arrivals occur at e_1 = 0.4, e_2 = 1.6, e_3 = 2.1, e_7 = 3.8, e_8 = 4.0, e_10 = 5.6, e_11 = 5.8, e_12 = 7.2; departures occur at e_4 = 2.4, e_5 = 3.1, e_6 = 3.3, e_9 = 4.9, e_13 = 8.6.]

Expected average delay in M/M/1 queue

From a single simulation run of n jobs (customers), a point estimate for d(n), the expected average delay in queue of the n jobs (customers), is:

d(n) = [ sum_{i=1}^{n} D_i ] / n
Expected average number of jobs in M/M/1 queue

T_i = total time during the simulation in which the queue of customers (jobs) is observed to have length i
T(n) = time to observe n delays in queue
Q(t) = number of customers in queue at time t
q(n) = average number of customers (jobs) in queue during the n observations

q(n) = [ sum_{i=1}^{∞} i·T_i ] / T(n)

Expected average number of jobs in M/M/1 queue

Since

sum_{i=1}^{∞} i·T_i = ∫_0^{T(n)} Q(t) dt

we get

q(n) = [ ∫_0^{T(n)} Q(t) dt ] / T(n)
Expected utilization of the machine in M/M/1 system

u(n) = expected proportion of time the server (machine) is busy during the n observations

Since B(t) is always either 0 or 1:

u(n) = [ ∫_0^{T(n)} B(t) dt ] / T(n)

Simple output from discrete event simulation

Take the earlier table of arrivals and services, with the first 5 observed completions (T(n) = 8.6):

sum_{i=1}^{∞} i·T_i = (0 x 3.2) + (1 x 2.3) + (2 x 1.7) + (3 x 1.4) = 9.9, with T_i = 0 for i >= 4

and therefore q(n) = 9.9 / 8.6 = 1.15
and u(n) = [(3.3 - 0.4) + (8.6 - 3.8)] / 8.6 = 0.90
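The worked numbers above can be checked by integrating the Q(t) and B(t) step functions over the event times. The sketch below is an illustration, not part of the lecture (the function name is mine); it is fed all eight arrivals but only the five observed service intervals, since observation stops at the fifth completion.

```python
def queue_stats(arrivals, starts, completions, t_end):
    """Time-average queue length q(n) and utilization u(n) over [0, t_end].

    Q(t) counts jobs that have arrived but not yet started service;
    B(t) is 1 whenever a job is in service.
    """
    # Step changes in Q(t): +1 at each arrival, -1 at each service start.
    # Ties at equal times cancel over a zero-length interval, so their
    # ordering does not affect the integral.
    events = sorted([(t, +1) for t in arrivals if t < t_end] +
                    [(s, -1) for s in starts if s < t_end])
    area_q, q, last = 0.0, 0, 0.0
    for t, dq in events:
        area_q += q * (t - last)          # rectangle under Q(t)
        q, last = q + dq, t
    area_q += q * (t_end - last)
    # Busy time is the total length of the (non-overlapping) service intervals.
    busy = sum(min(c, t_end) - s for s, c in zip(starts, completions) if s < t_end)
    return area_q / t_end, busy / t_end
```

With the table's data this returns q(n) ≈ 1.15 and u(n) ≈ 0.90, agreeing with the hand calculation.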
Discrete event simulation
� These values are simple illustrations of
statistics of discrete event simulation
• Discrete-time statistics (e.g. average delay in queue), or
• Continuous-time statistics (e.g. proportion of time the server is busy)

� A very large number of other useful statistics


could be obtained from each simulation run.
• In addition, very complex manufacturing systems can be modeled using this simple approach.


Discrete event simulation

• However, the model MUST be verified and validated to add credibility to the results. (See 4. Verification and Validation.)

• Experimental runs should then be carried out using the validated model.
Discrete event simulation
• However, the values from each experimental run are based on a "sample" of size 1 (one complete simulation run), and a sample size of 1 is not statistically useful.
• Multiple replications and confidence intervals are therefore essential elements of simulation output data analysis. (See 5. Output data analysis.)


Probability & statistics and Simulation


• Probability and statistics are an integral part of a simulation study
• Need to understand how to model a probabilistic system
• Validate a simulation model
• Input probability distributions
• Generate / use random samples from these distributions
• Perform statistical analysis of output data
• Design the simulation experiments

2. Input probability
distributions


Randomness in Manufacturing

• Process time
• MTTR
• MTTF
• Interarrival time
• Job types or part mix
• Yield
• Rework
• Transport time
• Setup time
• and so on

Using past data

• Use past data directly in the simulation. This is trace-driven simulation. (Effective for model validation.)
• Use the sample data to define an empirical distribution function and sample the required input data from this distribution.
• Use a standard technique (e.g. regression) to fit a theoretical distribution form to the sample data (e.g. exponential, etc.), and sample the required data from it.
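The second option, sampling from an empirical distribution, can be sketched as follows (an illustrative helper, not from the lecture; the function name is mine, and a production tool would typically interpolate between sorted observations rather than simply resample them):

```python
import random

def empirical_sampler(sample):
    """Return a draw() function for the empirical distribution defined by
    the observed data: inverting the empirical CDF F(x) = (#obs <= x)/n
    amounts to picking the observation at index floor(u * n)."""
    data = sorted(sample)
    n = len(data)
    def draw(u=None):
        if u is None:
            u = random.random()               # U(0,1) driver
        return data[min(int(u * n), n - 1)]   # empirical inverse CDF
    return draw
```

For example, with past observations [3.1, 1.2, 2.4], a uniform draw of 0.5 returns the middle observation 2.4.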


Steps in selecting Input probability distributions

• Assess sample independence:
– Confirm the observations X_1, X_2, ..., X_n are independent, using techniques such as a correlation plot.
• Hypothesize families of distributions:
– Without concern for specific parameters, select a general family, e.g. normal, exponential, etc.
• Estimate parameters:
– Use the past numerical data to estimate the parameters.
• Determine the best fit:
– Use a technique such as a probability plot or a chi-square test to identify the most suitable distribution function.
• The last three steps are an integral part of available software, so we may not have to carry out these steps manually.
3. Generating Random
Numbers and Random
Variates


Status
• Early simulation studies required random-number generation and the generation of random variates from distributions, often manually coded on computers.

• Most current simulation languages and simulators have built-in features for this.
Random number generation for simulation

• The built-in feature should:
• Generate random numbers, uniformly distributed on [0,1], that do not exhibit any correlation with each other.
• Reproduce a given stream of random numbers exactly (i.e. identical random numbers) for verification etc.
• Generate a large number of streams for multiple replications (i.e. different streams act as separate, independent generators).
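These requirements can be illustrated with the Python stdlib generator (a sketch, not lecture material; `make_streams` and the base seed are my own illustrative choices, and serious studies would use a generator with guaranteed stream separation):

```python
import random

def make_streams(n, base_seed=12345):
    """Create n reproducible random-number streams for multiple replications.
    Each stream is its own seeded generator object, so streams can be drawn
    from independently; rebuilding with the same seeds reproduces the exact
    same numbers, as required for verification."""
    return [random.Random(base_seed + k) for k in range(n)]
```

Calling `make_streams` twice with the same arguments yields streams that emit identical sequences, while the differently seeded streams emit different numbers.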


Random variate generation for simulation

• The built-in feature should also:
• Generate random variates.
• This means: produce observations for each selected variable (e.g. MTTR) from the parameters of the desired distribution function (e.g. gamma), using the IID U(0,1) random numbers, with computational efficiency.
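For a distribution with an invertible CDF, the inverse-transform method does exactly this. A minimal sketch for the exponential case (illustrative; the function name and the seed are my own choices):

```python
import math
import random

def expovariate_from_u(u, mean):
    """Inverse-transform method: map a U(0,1) number u to an exponential
    variate with the given mean, via F^{-1}(u) = -mean * ln(1 - u)."""
    return -mean * math.log(1.0 - u)

# Drive the generator from a reproducible U(0,1) stream (seed 42 is arbitrary).
rng = random.Random(42)
sample = [expovariate_from_u(rng.random(), mean=5.0) for _ in range(10000)]
```

The sample mean of a large batch comes out close to the requested mean, which is a quick sanity check on the transform.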
4. Verification and Validation


Definition
• Model verification: building the model right
– Correct translation of the conceptual simulation model into a working program
– Debugging
• Model validation: building the right model
– Determine whether the conceptual simulation model is an accurate representation.
• Credible model: objectives fulfilled using the model
– When the model is accepted by the user/client and used for the purpose it was built

Verification
• Errors arise from the data, the conceptual model, the computer model, even the computer system.

• Test sub-models first, then the complete model.

• Common techniques
– Static: a structured walk-through technique.
– Dynamic: run the program under different conditions, then check whether the output is reasonable.
– Trace: identify selected state variables (event list) after each event and check them against manual calculations.
– Animation: observe the animation.


What is Validation?
• A model is valid if its output behavior is sufficiently accurate to fulfill the purpose. Absolute accuracy is not essential, and it is too time-consuming and expensive.
– Check underlying theories, assumptions, and approximations.
– Check model structure and logic, mathematics, and relationships (by tracing entities in all sub-models and the main model).
– The model should be validated relative to the measures in the objective.

Validation
• Data have to be validated:
– it is difficult, time-consuming, and costly to obtain relevant, sufficient, and (most importantly) consistent and accurate factory data.


A three step Validation process

• Conventional simulation studies:
– Step 1. Face validation: ask people knowledgeable about / experienced with the system under study.
– Step 2. Empirically test and compare with other models, e.g. analytical models.
– Step 3. Detailed output data validation:
  (a) confidence intervals
  (b) correlated inspection approach
Confidence Intervals
• Let Y_1, Y_2, ..., Y_n be IID random variables with sample mean ȳ(n) and sample variance S²(n).
• It can be shown, using the central limit theorem, that when n is "sufficiently large" an approximate 100(1 - α) percent confidence interval (assuming a t distribution with n - 1 degrees of freedom) is given by:

l(n, α) = ȳ(n) - t_{n-1, 1-α/2} · sqrt( S²(n) / n )
u(n, α) = ȳ(n) + t_{n-1, 1-α/2} · sqrt( S²(n) / n )

[Figure: the t density with n - 1 degrees of freedom; the shaded central area (1 - α) lies between -t_{n-1, 1-α/2} and +t_{n-1, 1-α/2}.]
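The interval can be computed directly (an illustrative sketch; the function name is mine, and since the Python stdlib has no t table the caller supplies t_{n-1, 1-α/2}, or a normal quantile is used as a stated large-n approximation):

```python
import math
import statistics

def confidence_interval(y, alpha=0.05, t_crit=None):
    """Approximate 100(1-alpha)% confidence interval for the mean of the
    IID observations y, as on the slide. t_crit should be t_{n-1, 1-alpha/2};
    if omitted, the normal quantile is substituted as a large-n
    approximation (the stdlib has no t-distribution quantiles)."""
    n = len(y)
    ybar = statistics.mean(y)              # sample mean ybar(n)
    s2 = statistics.variance(y)            # sample variance S^2(n)
    if t_crit is None:
        t_crit = statistics.NormalDist().inv_cdf(1 - alpha / 2)
    half = t_crit * math.sqrt(s2 / n)
    return ybar - half, ybar + half
```

For ten observations, passing t_{9, 0.975} = 2.262 gives the textbook interval around the sample mean.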

Validation: Confidence Intervals approach

• Assume we collect m independent sets of data from the system and n independent sets of data from the model.
• Let X_j be the average of the observations of a desired variable (e.g. throughput) in the j-th set of system data, and Y_j be the average of the observations of the same variable in the j-th set of model data.

Validation: Confidence Intervals approach

• If we assume the m sets of system data are homogeneous, then the X_j's are IID random variables with mean µ_x.

• If the n sets of simulation-model data were generated using independent replications, then the Y_j's are IID random variables with mean µ_y.


Validation: Confidence Intervals approach

• One method of comparing the model with the system is to construct a confidence interval for ζ, where ζ = µ_x - µ_y.
• Let l(n, α) and u(n, α) be the lower and upper confidence-interval endpoints for ζ.
• If 0 ∉ [l(α), u(α)], then the observed difference between µ_x and µ_y is said to be statistically significant at level α.
Validation: Confidence Intervals approach

• However, even though the difference is statistically significant, the model may still be a valid representation of the system for fulfilling the objectives of the simulation.

• On the other hand, if 0 ∈ [l(α), u(α)], then the observed difference between µ_x and µ_y is said to be statistically not significant at level α and may be explained by sampling fluctuations.

Validation : Correlated Inspection Approach

• Statistics of the desired variable from the system are compared with the corresponding statistics of the model.

[Diagram: the same historical data from the factory drive both the actual system (i.e., the factory) and the simulation model; the output data from the factory are then compared with the model output data.]

Validation : Correlated Inspection Approach

• Suppose we want to validate the cycle time (e.g. makespan: the duration from arrival to completion).

• We make a number of observations of the factory cycle time X_j and of, for example, the interarrival times of the jobs.

• We then use the observed interarrival times of the factory to drive the simulation model and obtain cycle-time values Y_j from it.

Validation : Correlated Inspection Approach

• We compare the j-th set of system data X_j with the j-th set of simulation-model data Y_j, where X_j - Y_j is an estimate of µ_x - µ_y.

• We can look at the sample mean and sample variance of all the X_j - Y_j values to make a judgment on the validity of the simulation model for fulfilling the objectives.
5. Output data analysis


Transient and steady state behavior


• Experimental observations for output analysis should be made at steady state, i.e. after the transient phase of the stochastic simulation run.
• Consider the output random variables Y_i for i = 1, 2, ..., m.
• Let F_i(y | I) = P(Y_i ≤ y | I), where y is a real number and I represents the initial conditions.
• The transient distribution F_i(y | I) at discrete time i is different for each i and each set of initial conditions I.
Transient and steady state behavior

• If F_i(y | I) → F(y) as i → ∞, for all y and any initial conditions I, then F(y) is said to be the steady-state distribution.

[Figure: densities of Y_i at increasing times i_1, i_2, i_3, converging from the transient state to the steady state; the steady-state mean is E(Y), and the steady-state density is not necessarily normal.]

Random nature of simulation output


• Let Y_1, Y_2, ..., Y_i, ..., Y_m be the output of a stochastic process from a single simulation run (e.g. Y_i is the throughput in the i-th hour, and m is the number of observations).
• The Y_i's are random variables and generally not IID.
• If we carry out n independent replications (using n different random-number streams), we obtain a set of random observations as shown below:

y_11, ..., y_1i, ..., y_1m
y_21, ..., y_2i, ..., y_2m
. . .
y_n1, ..., y_ni, ..., y_nm

Output analysis
• Observations from a single replication (a row) are not IID.
• However, y_1i, y_2i, ..., y_ni from the n replications (a column) are IID observations of the random variable Y_i, for i = 1, 2, ..., m.
• This is the basis of the statistical analysis of the observations y_ji. For example, an unbiased estimate of E(Y_i) is:

ȳ_i(n) = [ sum_{j=1}^{n} y_ji ] / n
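The across-replication (column) estimate can be sketched as follows (an illustrative helper, not lecture code; the function name is mine):

```python
def column_means(runs):
    """runs[j][i] holds y_ji: observation i of replication j.
    The column (across-replication) means are unbiased estimates of E(Y_i),
    because each column collects IID observations of the same Y_i."""
    n, m = len(runs), len(runs[0])
    return [sum(runs[j][i] for j in range(n)) / n for i in range(m)]
```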

Confidence Interval of Output analysis - an example

• Comparing two systems on a given measure of performance is done by forming a confidence interval for the difference of the two expected values, e.g. E(Y_1j) - E(Y_2j), and checking whether 0 lies in it.
• If the numbers of observations n_1 = n_2 = n, then we pair Y_1j with Y_2j and define Z_j = Y_1j - Y_2j for j = 1, 2, ..., n. The Z_j's are IID random variables. An approximate 100(1 - α) percent confidence interval is:

z̄(n) ± t_{n-1, 1-α/2} · sqrt( S²(n) / n )
Paired-t confidence interval
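A sketch of the paired-t interval (illustrative, not lecture code; the function name is mine, and the caller supplies the critical value t_{n-1, 1-α/2}):

```python
import math
import statistics

def paired_t_ci(y1, y2, t_crit):
    """Paired-t confidence interval for E(Y1) - E(Y2), given equal numbers
    of paired observations from the two systems and t_crit = t_{n-1, 1-alpha/2}."""
    z = [a - b for a, b in zip(y1, y2)]    # Z_j = Y_1j - Y_2j, IID
    n = len(z)
    zbar = statistics.mean(z)
    half = t_crit * math.sqrt(statistics.variance(z) / n)
    return zbar - half, zbar + half
```

If the resulting interval excludes 0, the difference between the two systems is statistically significant at level α.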

Steps in conventional simulation

[Diagram: Real World System -> (Abstraction) -> Simulation Model -> (Experimentation) -> Formal Results -> (Interpretation) -> Recommendations -> (Implementation) -> back to the Real World System.]


Credibility Assessment in
Simulation projects

Credibility Assessment Stages
Quality Control of processes between phases

• Credibility assessment will always be subjective because
– modeling is an art
– credibility assessment is situation-dependent
• The accuracy of the assessment is always relative to the objective of the simulation, never absolute.

Credibility Assessment Stages


Quality Control of processes between phases

• Peer Assessment
– Panel of persons who are
• experts on the system under study
• expert modelers
• simulation analysts
• familiar with simulation projects

Credibility Assessment Stages
Quality Control of processes between phases

• Verify the formulated problem
– to make sure it faithfully reflects the real problem.
• Feasibility of simulation
– is data available? easy or costly to get?
– are resources for simulation available?
– cost-benefit: any time limit imposed to complete the study?
• The real system
– are the system's boundaries well-defined?
– have the objectives of the simulation changed with time?
– is counter-intuitive behavior accounted for?
– any drift to low performance?


Credibility Assessment Stages


Quality Control of processes between phases

• Qualifying the conceptual model
– are assumptions explicitly defined and appropriate?

• Verifying the communicative model
– use techniques such as walk-through, structural analysis, and data-flow analysis (Whitner & Balci, 1986).

• Verifying the programmed model
– use standard software verification techniques.

Credibility Assessment Stages
Quality Control of processes between phases
• Verify the experimental design
– is the random-number generator accurate and true?
– are the statistical techniques for the design and analysis of experiments appropriate?
– are initial transients accounted for?
– have you ensured identical experimental conditions for each alternative operating policy?

• Data validation (of model parameters and input data)
– are they appropriate? current? unbiased? interdependent? complete? accurate?
– are the instruments for data measurement and collection accurate?

Credibility Assessment Stages


Quality Control of processes between phases

• Validating the experimental model
– always compare the behavior of the model and the real system under identical input conditions.
– subjective and statistical validation techniques are applicable only when the data are completely observable.

• Interpretation of simulation results
– interpret numerical results based on the objective of the study. Judgment is involved.

• Documentation
– embed documentation into the model development cycle.

Credibility Assessment Stages
Quality Control of processes between phases

� Presentation
– communicating simulation results
� translate the jargon so non-simulation people
& decision-makers can understand

– presentation techniques
� integrate simulation results with a DSS, so
decision-maker can appreciate the significance
of the simulation results.


Wrap-up
• We have looked at the simulation of an M/M/1 queue
• We have discussed input probability distributions
• We talked about random numbers and random variates
• We discussed validation techniques
• We outlined output data analysis and confidence intervals
• We covered the life cycle and credibility assessment of simulation models
