
Chapter 5

Statistical Models in
Simulation

Banks, Carson, Nelson & Nicol


Discrete-Event System Simulation
Purpose & Overview
 The world the model-builder sees is probabilistic
rather than deterministic.
  Some statistical model might well describe the variations.

 An appropriate model can be developed by sampling
the phenomenon of interest:
  Select a known distribution through educated guesses
  Make an estimate of the parameter(s)
  Test for goodness of fit

 In this chapter:
  Review several important probability distributions
  Present some typical applications of these models

2
Review of Terminology and Concepts
 In this section, we review the following
concepts:
  Discrete random variables
  Continuous random variables
  Cumulative distribution function
  Expectation

3
Discrete Random Variables [Probability Review]

 X is a discrete random variable if the number of
possible values of X is finite, or countably infinite.
  Example: Consider jobs arriving at a job shop.
  Let X be the number of jobs arriving each week at a job shop.
  Rx = possible values of X (range space of X) = {0,1,2,…}
  p(xi) = probability the random variable is xi = P(X = xi)
  p(xi), i = 1,2,… must satisfy:
     1. p(xi) ≥ 0, for all i
     2. Σ_{i=1}^∞ p(xi) = 1

 The collection of pairs [xi, p(xi)], i = 1,2,…, is called the
probability distribution of X, and p(xi) is called the probability
mass function (pmf) of X.
4
Example 2

 x     1     2     3     4     5     6
 p(x)  1/21  2/21  3/21  4/21  5/21  6/21
5
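As a quick sketch outside the slides, the pmf in Example 2 can be checked with exact fractions: the probabilities must sum to 1, and the mean works out to E(X) = 91/21 = 13/3.

```python
from fractions import Fraction

# pmf from Example 2: p(x) = x/21 for x = 1..6
pmf = {x: Fraction(x, 21) for x in range(1, 7)}

# a valid pmf must sum to 1
assert sum(pmf.values()) == 1

# expected value: E(X) = sum of x * p(x)
mean = sum(x * p for x, p in pmf.items())
print(mean)  # 13/3
```

Using `Fraction` avoids any floating-point doubt about whether the twenty-firsts really add to 1.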
Continuous Random Variables [Probability Review]

 X is a continuous random variable if its range space Rx is an
interval or a collection of intervals.
  The probability that X lies in the interval [a,b] is given by:
     P(a ≤ X ≤ b) = ∫_a^b f(x) dx

 f(x), denoted as the pdf of X, satisfies:
  1. f(x) ≥ 0, for all x in Rx
  2. ∫_Rx f(x) dx = 1
  3. f(x) = 0, if x is not in Rx

 Properties
  1. P(X = x0) = 0, because ∫_x0^x0 f(x) dx = 0
  2. P(a < X < b) = P(a ≤ X < b) = P(a < X ≤ b) = P(a ≤ X ≤ b)
6
Continuous Random Variables [Probability Review]

 Example: The life of a device used to inspect cracks in
aircraft wings is given by X, a continuous random
variable assuming all values in the range x ≥ 0, with
pdf (in years):
     f(x) = (1/2) e^(−x/2),  x ≥ 0
          = 0,               otherwise

 X has an exponential distribution with mean 2 years.

 The probability that the device's life is between 2 and 3 years is:
     P(2 ≤ X ≤ 3) = ∫_2^3 (1/2) e^(−x/2) dx ≈ 0.14

7
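As a sketch outside the slides, the same probability follows from the closed-form cdf F(x) = 1 − e^(−x/2) of this exponential distribution:

```python
import math

# cdf of the device-life distribution: F(x) = 1 - exp(-x/2)
def F(x):
    return 1.0 - math.exp(-x / 2)

# P(2 <= X <= 3) = F(3) - F(2)
p = F(3) - F(2)
print(round(p, 2))  # 0.14
```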
Cumulative Distribution Function [Probability Review]
 The cumulative distribution function (cdf) is denoted by F(x), where
F(x) = P(X ≤ x)
  If X is discrete, then F(x) = Σ_{xi ≤ x} p(xi)
  If X is continuous, then F(x) = ∫_−∞^x f(t) dt

 Properties of the cdf
  1. F is a nondecreasing function: if a < b, then F(a) ≤ F(b)
  2. lim_{x→∞} F(x) = 1
  3. lim_{x→−∞} F(x) = 0

 All probability questions about X can be answered in terms of the cdf,
e.g.:
     P(a < X ≤ b) = F(b) − F(a), for all a < b
8
Example 5.4

 x     (−∞,1)  [1,2)  [2,3)  [3,4)  [4,5)  [5,6)  [6,∞)
 F(x)  0       1/21   3/21   6/21   10/21  15/21  21/21

9
Cumulative Distribution Function [Probability Review]

 Example: The inspection device has cdf:
     F(x) = ∫_0^x (1/2) e^(−t/2) dt = 1 − e^(−x/2)

 The probability that the device lasts for less than 2 years:
     P(0 ≤ X ≤ 2) = F(2) − F(0) = F(2) = 1 − e^(−1) ≈ 0.632

 The probability that it lasts between 2 and 3 years:
     P(2 ≤ X ≤ 3) = F(3) − F(2) = (1 − e^(−3/2)) − (1 − e^(−1)) ≈ 0.145

10
Expectation [Probability Review]

 The expected value of X is denoted by E(X)
  If X is discrete:    E(X) = Σ_i xi p(xi)
  If X is continuous:  E(X) = ∫_−∞^∞ x f(x) dx
  a.k.a. the mean, μ, or the 1st moment of X
  A measure of the central tendency

 The variance of X is denoted by V(X) or var(X) or σ²
  Definition: V(X) = E[(X − E[X])²]
  Also, V(X) = E(X²) − [E(X)]²
  A measure of the spread or variation of the possible values of X
around the mean

 The standard deviation of X is denoted by σ
  Definition: square root of V(X)
  Expressed in the same units as the mean

11
Expectations [Probability Review]

 Example: The mean life of the previous inspection
device is:
     E(X) = ∫_0^∞ (1/2) x e^(−x/2) dx
          = [−x e^(−x/2)]_0^∞ + ∫_0^∞ e^(−x/2) dx = 2

 To compute the variance of X, we first compute E(X²):
     E(X²) = ∫_0^∞ (1/2) x² e^(−x/2) dx
           = [−x² e^(−x/2)]_0^∞ + 2 ∫_0^∞ x e^(−x/2) dx = 8

 Hence, the variance and standard deviation of the device's
life are:
     V(X) = 8 − 2² = 4
     σ = √V(X) = 2
12
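These integrals can be cross-checked numerically; a minimal sketch (not part of the slides) using a crude midpoint Riemann sum over a long enough interval:

```python
import math

# pdf of the device-life distribution: exponential with mean 2
def f(x):
    return 0.5 * math.exp(-x / 2)

# crude midpoint Riemann sum over [0, upper]; the tail beyond 50 is negligible
def integrate(g, upper=50.0, n=200_000):
    dx = upper / n
    return sum(g((i + 0.5) * dx) * dx for i in range(n))

mean = integrate(lambda x: x * f(x))        # E(X)   -> 2
second = integrate(lambda x: x * x * f(x))  # E(X^2) -> 8
var = second - mean ** 2                    # V(X)   -> 4
print(round(mean, 3), round(second, 3), round(var, 3))
```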
Useful Statistical Models
 In this section, statistical models appropriate
to some application areas are presented.
The areas include:
 Queuing systems
 Inventory and supply-chain systems
 Reliability and maintainability
 Limited data

13
Other areas [Useful Models]

17
Discrete Distributions
 Discrete random variables are used to
describe random phenomena in which only
integer values can occur.
 In this section, we will learn about:
 Bernoulli trials and Bernoulli distribution
 Binomial distribution
 Geometric and negative binomial distribution
 Poisson distribution

18
Bernoulli Trials
and Bernoulli Distribution [Discrete Dist’n]
 Bernoulli trials:
  Consider an experiment consisting of n trials, each of which can be a
success or a failure.
  Let Xj = 1 if the jth experiment is a success
  and Xj = 0 if the jth experiment is a failure
 The Bernoulli distribution (one trial):
     pj(xj) = p(xj) = p,          xj = 1, j = 1,2,…,n
                    = 1 − p = q,  xj = 0, j = 1,2,…,n
                    = 0,          otherwise
  where E(Xj) = p and V(Xj) = p(1 − p) = pq
 Bernoulli process:
  The n Bernoulli trials where the trials are independent:
     p(x1,x2,…,xn) = p1(x1) p2(x2) … pn(xn)
19
Binomial Distribution [Discrete Dist’n]

 The number of successes in n Bernoulli trials, X, has
a binomial distribution:
     p(x) = C(n,x) p^x q^(n−x),  x = 0,1,2,…,n
          = 0,                   otherwise
  C(n,x) = n!/(x!(n−x)!) is the number of outcomes having the
required number of successes and failures, and p^x q^(n−x) is the
probability that a particular outcome has x successes and (n−x)
failures.

 The mean: E(X) = p + p + … + p = np
 The variance: V(X) = pq + pq + … + pq = npq
20
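A short sketch (the parameters n = 10, p = 0.3 are illustrative choices, not from the slides) evaluating the pmf with `math.comb` and confirming that the mean and variance come out to np and npq:

```python
import math

# Binomial pmf: the probability of x successes in n Bernoulli trials,
# using math.comb for the C(n, x) counting term
def binom_pmf(x, n, p):
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 10, 0.3  # illustrative parameters
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]

assert abs(sum(pmf) - 1.0) < 1e-12  # a valid pmf sums to 1
mean = sum(x * q for x, q in enumerate(pmf))
var = sum(x * x * q for x, q in enumerate(pmf)) - mean**2
print(round(mean, 6), round(var, 6))  # 3.0 2.1  (= np and npq)
```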
Geometric & Negative
Binomial Distribution [Discrete Dist’n]

 Geometric distribution
  The number of Bernoulli trials, X, to achieve the 1st success:
     p(x) = q^(x−1) p,  x = 1,2,…
          = 0,          otherwise
  E(X) = 1/p, and V(X) = q/p²

 Negative binomial distribution
  The number of Bernoulli trials, Y, until the kth success
  If Y is a negative binomial random variable with parameters p and k, then:
     p(y) = C(y−1, k−1) q^(y−k) p^k,  y = k, k+1, k+2,…
          = 0,                        otherwise
  E(Y) = k/p, and V(Y) = kq/p²
21
Poisson Distribution [Discrete Dist’n]

 The Poisson distribution expresses the probability of a given number of
events occurring in a fixed interval of time or space, if these events occur
at a known constant rate and independently of the time since the last event.
 An event can occur 0, 1, 2, … times in an interval. The average number of
events in an interval is designated λ (lambda), also called the rate
parameter. The pmf and cdf are:
     p(x) = e^(−λ) λ^x / x!,  x = 0,1,2,…
     F(x) = Σ_{i=0}^{x} e^(−λ) λ^i / i!
  E(X) = λ = V(X)
22
Poisson Distribution [Discrete Dist’n]

 Example: A computer repair person is “beeped”
each time there is a call for service. The number of
beeps per hour ~ Poisson(λ = 2 per hour).

 The probability of three beeps in the next hour:
     p(3) = e^(−2) 2³/3! ≈ 0.18
     also, p(3) = F(3) − F(2) = 0.857 − 0.677 = 0.18

 The probability of two or more beeps in a 1-hour period:
     p(2 or more) = 1 − p(0) − p(1)
                  = 1 − F(1)
                  ≈ 0.594
23
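The beeps example can be verified directly from the pmf; a minimal sketch:

```python
import math

# Poisson pmf with rate lam: p(k) = exp(-lam) * lam**k / k!
def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 2.0  # beeps per hour, as in the example

p3 = poisson_pmf(3, lam)                                       # ~0.18
p_two_or_more = 1 - poisson_pmf(0, lam) - poisson_pmf(1, lam)  # ~0.594
print(round(p3, 2), round(p_two_or_more, 3))  # 0.18 0.594
```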
Continuous Distributions
 Continuous random variables can be used to
describe random phenomena in which the
variable can take on any value in some
interval.
 In this section, the distributions studied are:
 Uniform
 Exponential
 Normal
 Weibull
 Lognormal

24
Uniform Distribution [Continuous Dist’n]

 A random variable X is uniformly distributed on the
interval (a,b), U(a,b), if its pdf and cdf are:
     f(x) = 1/(b − a),  a ≤ x ≤ b
          = 0,          otherwise

     F(x) = 0,                x < a
          = (x − a)/(b − a),  a ≤ x < b
          = 1,                x ≥ b

 Properties
  P(x1 < X < x2) is proportional to the length of the interval:
F(x2) − F(x1) = (x2 − x1)/(b − a)
  E(X) = (a + b)/2        V(X) = (b − a)²/12
 U(0,1) provides the means to generate random
numbers, from which random variates can be
generated.
25
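The last point can be sketched via inverse-transform sampling: a U(0,1) draw u becomes an exp(λ) variate through x = −ln(1 − u)/λ. The rate λ = 0.5 below is a hypothetical choice for illustration, not from the slides.

```python
import math
import random

# Inverse-transform sketch: turn U(0,1) draws into exponential variates
random.seed(1)
lam = 0.5  # hypothetical rate; the variates should have mean 1/lam = 2

def exp_variate(lam):
    u = random.random()              # uniform on [0, 1)
    return -math.log(1.0 - u) / lam  # inverse of F(x) = 1 - exp(-lam x)

sample = [exp_variate(lam) for _ in range(100_000)]
mean = sum(sample) / len(sample)
print(round(mean, 2))  # close to 1/lam = 2.0
```

Chapter 8 of the same text develops this random-variate generation idea in full.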
Exponential Distribution [Continuous Dist’n]

 A random variable X is exponentially distributed with
parameter λ > 0 if its pdf and cdf are:
     f(x) = λ e^(−λx),  x ≥ 0
          = 0,          elsewhere

     F(x) = 0,                                  x < 0
          = ∫_0^x λ e^(−λt) dt = 1 − e^(−λx),  x ≥ 0

 E(X) = 1/λ        V(X) = 1/λ²

 Used to model interarrival times
when arrivals are completely
random, and to model service
times that are highly variable
 For several different exponential
pdf’s (see figure), the value of the
intercept on the vertical axis is λ,
and all pdf’s eventually intersect.
26
Exponential Distribution [Continuous Dist’n]

 Memoryless property
  For all s and t greater than or equal to 0:
     P(X > s + t | X > s) = P(X > t)

 Example: A lamp ~ exp(λ = 1/3 per hour), hence,
on average, 1 failure per 3 hours.
  The probability that the lamp lasts longer than its mean
life is: P(X > 3) = 1 − (1 − e^(−3/3)) = e^(−1) ≈ 0.368
  The probability that the lamp lasts between 2 and 3 hours
is:
     P(2 ≤ X ≤ 3) = F(3) − F(2) ≈ 0.145
  The probability that it lasts for another hour given that it has
been operating for 2.5 hours:
     P(X > 3.5 | X > 2.5) = P(X > 1) = e^(−1/3) ≈ 0.717
27
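The memoryless property in the lamp example can be checked with the survival function P(X > x) = e^(−λx); a minimal sketch:

```python
import math

lam = 1.0 / 3.0  # lamp failure rate per hour, as in the example

# survival function of exp(lam): P(X > x) = exp(-lam * x)
def survival(x):
    return math.exp(-lam * x)

# conditional probability P(X > 3.5 | X > 2.5)
cond = survival(3.5) / survival(2.5)

# memoryless: the result equals the unconditional P(X > 1)
assert abs(cond - survival(1.0)) < 1e-12
print(round(cond, 3))  # 0.717
```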
Normal Distribution [Continuous Dist’n]

 A random variable X is normally distributed if it has the pdf:
     f(x) = (1/(σ√(2π))) exp[−(1/2)((x − μ)/σ)²],  −∞ < x < ∞

  Mean: −∞ < μ < ∞
  Variance: σ² > 0
  Denoted as X ~ N(μ, σ²)
 Special properties:
  lim_{x→−∞} f(x) = 0, and lim_{x→∞} f(x) = 0.
  f(μ − x) = f(μ + x); the pdf is symmetric about μ.
  The maximum value of the pdf occurs at x = μ; the mean and
mode are equal.
28
Normal Distribution [Continuous Dist’n]

 Evaluating the distribution:
  Use numerical methods (no closed form)
  Independent of μ and σ, using the standard normal
distribution: Z ~ N(0,1)
  Transformation of variables: let Z = (X − μ)/σ, then
     F(x) = P(X ≤ x) = P(Z ≤ (x − μ)/σ)
          = ∫_−∞^((x−μ)/σ) (1/√(2π)) e^(−z²/2) dz
          = Φ((x − μ)/σ)
  where Φ(z) = ∫_−∞^z (1/√(2π)) e^(−t²/2) dt is the cdf of the
standard normal distribution.
29
Normal Distribution [Continuous Dist’n]

 Example: The time required to load an oceangoing
vessel, X, is distributed as N(12, 4).
  The probability that the vessel is loaded in less than 10
hours:
     F(10) = Φ((10 − 12)/2) = Φ(−1) = 0.1587
  By the symmetry property, Φ(−1) = 1 − Φ(1).
30
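Φ has no closed form, but it can be evaluated with the standard library's error function; a quick sketch of the vessel example:

```python
import math

# standard normal cdf via the error function:
# Phi(z) = (1 + erf(z / sqrt(2))) / 2
def phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma = 12.0, 2.0  # X ~ N(12, 4), so sigma = 2

# P(X < 10) = Phi((10 - mu) / sigma) = Phi(-1)
p = phi((10.0 - mu) / sigma)
print(round(p, 4))  # 0.1587
```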
Weibull Distribution [Continuous Dist’n]

 A random variable X has a Weibull distribution if its pdf has the
form:
     f(x) = (β/α)((x − ν)/α)^(β−1) exp[−((x − ν)/α)^β],  x ≥ ν
          = 0,                                            otherwise

 3 parameters:
  Location parameter: ν (−∞ < ν < ∞)
  Scale parameter: α (α > 0)
  Shape parameter: β (β > 0)
 Example: ν = 0 and α = 1:
  When β = 1, X ~ exp(λ = 1/α)
31
Lognormal Distribution [Continuous Dist’n]

 A random variable X has a lognormal distribution if
its pdf has the form:
     f(x) = (1/(√(2π) σx)) exp[−(ln x − μ)²/(2σ²)],  x > 0
          = 0,                                        otherwise
  (Figure: pdf’s with μ = 1 and σ² = 0.5, 1, 2.)

 Mean: E(X) = e^(μ + σ²/2)
 Variance: V(X) = e^(2μ + σ²)(e^(σ²) − 1)

 Relationship with the normal distribution
  When Y ~ N(μ, σ²), then X = e^Y ~ lognormal(μ, σ²)
  Parameters μ and σ² are not the mean and variance of the
lognormal
32
Poisson Process
 Definition: N(t) is a counting function that
represents the number of events that occurred in [0,t].
 A counting process {N(t), t ≥ 0} is a Poisson
process with mean rate λ if:
  Arrivals occur one at a time
  {N(t), t ≥ 0} has stationary increments
  {N(t), t ≥ 0} has independent increments
 Properties
     P[N(t) = n] = e^(−λt)(λt)^n / n!,  for t ≥ 0 and n = 0,1,2,…
  Equal mean and variance: E[N(t)] = V[N(t)] = λt
  Stationary increments: the number of arrivals in time s to t
is also Poisson-distributed, with mean λ(t − s)
33
Interarrival Times [Poisson Dist’n]

 Consider the interarrival times of a Poisson process (A1, A2, …),
where Ai is the elapsed time between arrival i and arrival i+1.

 The 1st arrival occurs after time t iff there are no arrivals in the
interval [0,t], hence:
     P{A1 > t} = P{N(t) = 0} = e^(−λt)
     P{A1 ≤ t} = 1 − e^(−λt)   [cdf of exp(λ)]
 Interarrival times, A1, A2, …, are exponentially distributed and
independent, with mean 1/λ.

 Arrival counts ~ Poi(λ): stationary & independent increments.
 Interarrival times ~ exponential with mean 1/λ: memoryless.
34
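The duality above suggests a simple simulation check (the rate, horizon, and replication count below are illustrative choices, not from the slides): generate exponential interarrivals and confirm that the mean count over [0, t] approaches λt.

```python
import random

# Simulation sketch: arrivals built from exponential interarrivals with
# mean 1/lam should yield an average count over [0, t] close to lam * t
random.seed(42)
lam, t, reps = 2.0, 10.0, 5_000

def count_arrivals(lam, t):
    n, clock = 0, 0.0
    while True:
        clock += random.expovariate(lam)  # exponential interarrival time
        if clock > t:
            return n
        n += 1

mean_count = sum(count_arrivals(lam, t) for _ in range(reps)) / reps
print(round(mean_count, 1))  # close to lam * t = 20.0
```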
Splitting and Pooling [Poisson Dist’n]

 Splitting:
  Suppose each event of a Poisson process can be classified
as Type I, with probability p, or Type II, with probability 1−p.
  N(t) = N1(t) + N2(t), where N1(t) and N2(t) are both Poisson
processes, with rates λp and λ(1−p):
     N(t) ~ Poi(λ)  →(p)→    N1(t) ~ Poi(λp)
                    →(1−p)→  N2(t) ~ Poi(λ(1−p))

 Pooling:
  Suppose two Poisson processes are pooled together.
  N1(t) + N2(t) = N(t), where N(t) is a Poisson process with
rate λ1 + λ2:
     N1(t) ~ Poi(λ1), N2(t) ~ Poi(λ2)  →  N(t) ~ Poi(λ1 + λ2)
35
Nonstationary Poisson
Process (NSPP) [Poisson Dist’n]
 A Poisson process without the stationary increments, characterized by
λ(t), the arrival rate at time t.
 The expected number of arrivals by time t, Λ(t):
     Λ(t) = ∫_0^t λ(s) ds

 Relating a stationary Poisson process N(t) with rate 1 to an NSPP
N(t) with rate λ(t):
  Let the arrival times of a stationary process with rate λ = 1 be t1, t2, …,
and the arrival times of an NSPP with rate λ(t) be T1, T2, …; then:
     ti = Λ(Ti)
     Ti = Λ⁻¹(ti)
36
Nonstationary Poisson
Process (NSPP) [Poisson Dist’n]
 Example: Suppose arrivals to a Post Office occur at a rate of 2 per hour
from 8 am until 12 pm, and then 0.5 per hour until 4 pm.
 Let t = 0 correspond to 8 am; the NSPP N(t) has rate function:
     λ(t) = 2,    0 ≤ t < 4
          = 0.5,  4 ≤ t ≤ 8
 Expected number of arrivals by time t:
     Λ(t) = 2t,                                   0 ≤ t < 4
          = ∫_0^4 2 ds + ∫_4^t 0.5 ds = t/2 + 6,  4 ≤ t ≤ 8

 Hence, the probability distribution of the number of arrivals between
11 am and 2 pm (t = 3 to t = 6):
     P[N(6) − N(3) = k] = P[N(Λ(6)) − N(Λ(3)) = k]
                        = P[N(9) − N(6) = k]
                        = e^(−(9−6)) (9 − 6)^k / k! = e^(−3) 3^k / k!

37
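The Post Office example can be checked in a few lines; a sketch that evaluates Λ(t) and, as one concrete case, the probability of exactly 2 arrivals between 11 am and 2 pm:

```python
import math

# Expected number of arrivals by time t (hours after 8 am):
# Lambda(t) = 2t on [0,4], then 8 + 0.5(t - 4) on (4,8]
def Lambda(t):
    if t <= 4:
        return 2.0 * t
    return 8.0 + 0.5 * (t - 4)

# Arrivals between 11 am (t=3) and 2 pm (t=6) ~ Poisson(Lambda(6) - Lambda(3))
mean = Lambda(6) - Lambda(3)  # 9 - 6 = 3 expected arrivals
p2 = math.exp(-mean) * mean**2 / math.factorial(2)  # P[exactly 2 arrivals]
print(mean, round(p2, 3))  # 3.0 0.224
```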
Empirical Distributions

 A distribution whose parameters are the observed values
in a sample of data.
  May be used when it is impossible or unnecessary to establish that
a random variable has any particular parametric distribution.
  Advantage: no assumption beyond the observed values in the
sample.
  Disadvantage: the sample might not cover the entire range of
possible values.

38
Summary
 The world that the simulation analyst sees is probabilistic,
not deterministic.
 In this chapter:
 Reviewed several important probability distributions.
 Showed applications of the probability distributions in a simulation
context.
 An important task in simulation modeling is the collection and
analysis of input data, e.g., hypothesizing a distributional
form for the input data. The reader should know:
 Difference between discrete, continuous, and empirical
distributions.
 Poisson process and its properties.

39
