
Chapter 5

Sampling in Discrete Event Simulation
Contents
• Fast review on probabilities and random variables
• Need for sampling
• Random number generation
• Pseudo-random numbers
• Random variate generation: inverse transform,
rejection and convolution
• Sampling from discrete distributions: binomial
distribution and Poisson distribution
• Sampling from continuous distributions: negative
exponential distribution, uniform distribution and
normal distribution
Fast review on probabilities and random variables

• Probability
– Is a measure of uncertainty in the occurrence of a
random phenomenon
• Occurrence of head or tail in a coin toss experiment
– Is measured on a continuous scale between [0, 1]
– Quantifies the likelihood of occurrence
– Can be estimated as a frequency ratio over repeated experiments
• If n experiments are performed and phenomenon A
occurs in k of them, then
P̂_A = k / n
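As a quick illustration of the frequency-ratio estimate, the minimal Python sketch below simulates repeated coin tosses and estimates P(head) as k/n. The fair-coin probability 0.5 and the seed are assumptions made only for this illustration.

import random

def estimate_probability(n, p_head=0.5, seed=1):
    """Estimate P(head) as the frequency ratio k/n over n simulated tosses."""
    rng = random.Random(seed)
    k = sum(1 for _ in range(n) if rng.random() < p_head)  # count the tosses that came up head
    return k / n

if __name__ == "__main__":
    for n in (100, 10_000, 1_000_000):
        print(n, estimate_probability(n))  # the estimate approaches 0.5 as n grows
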
Sample space, sample points and event
• Sample points
– Each possible outcome of an experiment
• Sample space
– The set of all possible sample points
• Event
– A subset of sample space
• Example: if we toss a coin 1000 times and collect the
outcomes
– each of the 1000 recorded outcomes is a sample point
– S = {head, tail} is the sample space
– A = {head} is a simple event
Statistical independence and conditional
probability
• Statistical independence
– Two random events A and B are said to be statistically
independent if and only if
P(A ∩ B) = P(A) · P(B)
– Equivalently
P(A|B) = P(A) and P(B|A) = P(B)
• Conditional probability
– P(A|B) is the conditional probability of event A, i.e. the
probability that A occurs given that B has occurred
• Conditional probability
– Definition
P(A|B) = P(A ∩ B) / P(B), provided P(B) > 0
• Example: in a throw of a die,
– P(4) = 1/6, but
– P(4 | the number obtained is even) = 1/3
Mutually exclusive and disjoint events
• Two events A and B are said to be mutually
exclusive if and only if
P(A ∩ B) = 0, and hence
P(A ∪ B) = P(A) + P(B)
• Two events are said to be disjoint if and only if
A ∩ B = ∅
Random variable and variate
• In simulation studies, a random variable is also known
as a variate
• A random variable is a function that assigns a real
number to each possible outcome of an
experiment
• A random variable is represented by a capital letter
and its value by a small letter
– Example: X = status of a machine in a simulation
• The machine status can be S = {idle, busy, failed}
• A random variable can be discrete or
continuous
– If the value assigned can be any value in an interval, the
random variable is continuous
– If the values it can assume are countable or finite,
it is discrete
– Example: X – state of a server – discrete
• Y – time to failure of a server – continuous
Distribution functions
• Describe the probabilistic properties of a
random variable
• Can be discrete or continuous
• These are
– Probability mass function
– Cumulative distribution function
– Probability density function
– Joint distribution
Probability mass function

• Consider the random variable X, the state of a
machine, where the space S of X is given by
S = {idle, busy, failed}
• Let each of the possible values of X, idle, busy
and failed be denoted as x1,x2,x3
• If P(X=x1)=0.3 , P(X=x2)=0.6 and P(X=x3)=0.1
• PX(x) – is the probability mass function
• PX(x)=P(X=x) where x is element of S
Properties of pmf
• P_X(x) always lies between 0 and 1
0 ≤ P_X(x) ≤ 1
• The sum of P_X(x) over all x is 1
Σ_{x ∈ S} P_X(x) = 1
• In the previous example
– P(X=idle) + P(X=busy) + P(X=failed) = 0.3 + 0.6 + 0.1 = 1
Cumulative distribution function
• Consider the random variable X which is the
outcome of throwing a die
• The space is S = {1, 2, 3, 4, 5, 6}
P(X ≤ 1) = 1/6    P(X ≤ 4) = 2/3
P(X ≤ 2) = 1/3    P(X ≤ 5) = 5/6
P(X ≤ 3) = 1/2    P(X ≤ 6) = 1
• The function
F_X(x) = P(X ≤ x)
is known as the cumulative distribution function
Properties of CDF
• 1) 0 ≤ F_X(x) ≤ 1 for all x
• 2) lim_{x→−∞} F_X(x) = 0 and lim_{x→+∞} F_X(x) = 1
• 3) if x1 < x2 then F_X(x1) ≤ F_X(x2), and
P(x1 < X ≤ x2) = F_X(x2) − F_X(x1)
Example 2: the random variable X = sum of the numbers
showing on the faces of two dice
[Figure: pmf of the sum of two dice, and the corresponding CDF]
Probability density function

• Is the derivative of the CDF
• Definition
– If F_X(x) is continuous and differentiable, then the
pdf is defined as
f_X(x) = d F_X(x) / dx
• Properties of the pdf
– f_X(x) ≥ 0 for all x
– ∫_{−∞}^{∞} f_X(x) dx = 1
Properties of pdf cont…
• The CDF can be recovered from the pdf by integration:
F_X(x) = ∫_{−∞}^{x} f_X(t) dt
• Moreover, the pdf can be used to calculate the
probability of a given interval:
P(a < X ≤ b) = ∫_{a}^{b} f_X(x) dx
Joint distribution
• Consider n real-valued random variables X1,
X2, …, Xn; each has its own CDF and pdf,
denoted F_{Xi}(x) and f_{Xi}(x)
• F_{Xi}(x) is called the marginal distribution and f_{Xi}(x)
the marginal density
• The joint distribution is obtained as
F_{X1,…,Xn}(x1, …, xn) = P(X1 ≤ x1, …, Xn ≤ xn)
• And the joint pdf is
f_{X1,…,Xn}(x1, …, xn) = ∂ⁿ F_{X1,…,Xn}(x1, …, xn) / (∂x1 … ∂xn)
Mutual Independence and IID
• The random variables X1, …, Xn are said to be
mutually independent if the joint CDF is the
product of the individual CDFs, or equivalently the joint
pdf is the product of the individual pdfs
• The random variables X1, …, Xn are said to be
independent and identically distributed (IID) if
they are mutually independent and their
marginal distributions are the same
Expectations and moments
• The expectation or expected value of a random
variable is related to the mean or average
• It is the sum of the products of the values of
the random variable and the probabilities it
takes at those values
E[X] = Σ_{x ∈ S} x · P_X(x)  for a discrete variable, and
E[X] = ∫_{−∞}^{∞} x · f_X(x) dx  for a continuous variable
Example: expected value of the random variable in
throwing two dice
E[X] = Σ_{k=2}^{12} k · P_X(k) = 7
• The expected value is also known as the first moment
Properties of expected values
• Multiplication by a constant
– If c is a constant, then E[cX] = c·E[X]
• Linearity
– If X and Y are two random variables, then
E[X + Y] = E[X] + E[Y]
• Independence
– If X and Y are two independent random variables, then
E[XY] = E[X]·E[Y]
Moments
• Are expectations of powers of a random
variable
• The kth moment is defined as E[X^k]
• Important moments:
– k=2 relates to variance and standard deviation
– k=3 relates to skewness
– k=4 relates to kurtosis
Moments cont…
• k=2: variance and standard deviation
– variance: V[X] = E[X²] − (E[X])²
– standard deviation: σ_X = √V[X]
• Both are measures of variability
• But they are not linear: V[aX + bY] ≠ a·V(X) + b·V(Y) in general
• Example: calculate the second moment of the
two-dice experiment
E[X²] = 54.833
Hence the variance is V[X] = 54.833 − 49 = 5.833
Equivalently, computing E[(X − 7)²] directly:
V[X] = (1/36)·((2−7)²·1 + (3−7)²·2 + (4−7)²·3 + (5−7)²·4 + (6−7)²·5 + (7−7)²·6
+ (8−7)²·5 + (9−7)²·4 + (10−7)²·3 + (11−7)²·2 + (12−7)²·1) = 5.833
Moments cont…
• k=3: skewness – departure from symmetry
• The coefficient of skewness is given as
γ_X = E[(X − E[X])³] / (V[X])^(3/2)
• Its value can be negative, zero or positive
– Negative – left skewed
– Zero – symmetric
– Positive – right skewed
• For the two-dice example the distribution is symmetric about 7, so
E[(X − 7)³] = (1/36)·((2−7)³·1 + (3−7)³·2 + (4−7)³·3 + (5−7)³·4 + (6−7)³·5 + (7−7)³·6
+ (8−7)³·5 + (9−7)³·4 + (10−7)³·3 + (11−7)³·2 + (12−7)³·1) = 0
and hence the skewness is zero
Moments cont…
• k=4: the fourth moment influences the kurtosis of the
distribution
• Kurtosis is the degree of 'fatness' of a
distribution's tails relative to a normal
distribution having the same standard deviation
• Coefficient of (excess) kurtosis
κ_X = E[(X − E[X])⁴] / (V[X])² − 3
For the two-dice example the kurtosis becomes
κ_X = 80.5 / 5.833² − 3 ≈ −0.63
The kurtosis is negative, which means the tails are less fat than those of a
normal distribution with the same standard deviation
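The moment calculations for the two-dice example can be checked numerically. The Python sketch below uses the exact pmf of the sum of two fair dice, P(X = s) = (6 − |s − 7|)/36, and reproduces the variance 5.833, zero skewness, and excess kurtosis of about −0.63 quoted above.

from fractions import Fraction

# Exact pmf of the sum of two fair dice: P(X = s) = (6 - |s - 7|) / 36
pmf = {s: Fraction(6 - abs(s - 7), 36) for s in range(2, 13)}

mean = sum(s * p for s, p in pmf.items())               # E[X] = 7
var = sum((s - mean) ** 2 * p for s, p in pmf.items())  # V[X] = 210/36 ≈ 5.833
m3 = sum((s - mean) ** 3 * p for s, p in pmf.items())   # third central moment = 0
m4 = sum((s - mean) ** 4 * p for s, p in pmf.items())   # fourth central moment = 80.5

skewness = float(m3) / float(var) ** 1.5                # 0 (symmetric distribution)
excess_kurtosis = float(m4) / float(var) ** 2 - 3       # ≈ -0.63

print(float(mean), float(var), skewness, excess_kurtosis)
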
Correlations
• Correlations measure how two random
variables X and Y defined over the same space
vary with respect to each other
• They give a notion of their association
• The correlation coefficient is given as
ρ_{XY} = Cov(X, Y) / (σ_X σ_Y),  where Cov(X, Y) = E[XY] − E[X]E[Y]
• Properties
– a) −1 ≤ ρ_{XY} ≤ 1
– b) independent random variables are uncorrelated, but the
converse is not true
Common discrete distributions
• 1. Generic discrete distribution
– It is denoted by
• Disc({(pi, vi): i=1,2,…}) where each parameter pair (pi, vi)
corresponds to P{X = vi} = pi
• It can model, or be used in simulation of, various
situations having discrete outcomes
• The probability mass function of the discrete distribution is
defined as
p_X(x) = Σ_i pi · 1_{vi}(x)
• Where 1_{vi}(x) = 1 if x = vi, and 0 otherwise
Bernoulli distribution
• A trial with two possible outcomes
– Success or failure, true or false
– Its state space is S = {0, 1}
– It is denoted by Ber(p), where p is the probability of
success
– The probability mass function is given by
p_X(1) = p and p_X(0) = 1 − p
– The expected value and variance are
E[X] = p and V[X] = p(1 − p)
Binomial distribution
• Is the sum of n independent Bernoulli random
variables with a common success probability p
• It is denoted by B(n, p)
• The probability mass function is given by
P_X(k) = C(n, k) · p^k · (1 − p)^(n − k),  k = 0, 1, …, n
• Example: In an examination, a total of 30 students appeared. If the
probability of one student passing is 0.6, what is the
probability that no student will pass, and that exactly 20 students
will pass?
P_X(0) = C(30, 0) · 0.6^0 · (1 − 0.6)^30 ≈ 0
P_X(20) = C(30, 20) · 0.6^20 · (1 − 0.6)^10 ≈ 0.1152
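The two probabilities in the examination example can be evaluated directly from the binomial pmf. A minimal Python check using math.comb follows; the function name binomial_pmf is introduced here only for illustration.

from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ B(n, p)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

print(binomial_pmf(0, 30, 0.6))    # ~1.15e-12, effectively 0
print(binomial_pmf(20, 30, 0.6))   # ~0.1152
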
Binomial cont…
• The expected value and variance of the binomial
distribution are given by
E[X] = np and V[X] = np(1 − p)
• For the above example the expectation and
variance will be
– E[X] = 30 · 0.6 = 18 and
– V[X] = 30 · 0.6 · (1 − 0.6) = 7.2
Poisson distribution
• It is often used to model the number of
occurrences of a random event in a time
interval
• Example: the number of customer demands in a
time interval
• Denoted by Pois(λ), λ being the rate of occurrence
• The pmf of the Poisson distribution is given by
P_X(k) = e^(−λ) · λ^k / k!,  k = 0, 1, 2, …
• The mean and variance are the same:
E[X] = V[X] = λ
Continuous distributions
• Uniform
– Assumes values in an interval S = [a, b] with b > a
– Each value is equally likely
– Denoted by Unif(a, b)
– The pdf and CDF are given by
f_X(x) = 1/(b − a) and F_X(x) = (x − a)/(b − a) for a ≤ x ≤ b
– The mean and variance are given by
E[X] = (a + b)/2 and V[X] = (b − a)²/12
Continuous distribution cont…

• Exponential distribution
– Assumes values on the positive half line
– It models random inter-arrival times in continuous
time, especially when they are iid
– Denoted by Expo(λ), λ being the rate parameter
– The pdf and CDF are
f_X(x) = λ·e^(−λx) and F_X(x) = 1 − e^(−λx) for x ≥ 0
– The mean and variance are given by
E[X] = 1/λ and V[X] = 1/λ²
[Figure: pdf of an exponential distribution]
Exercise: determine the rate parameter from the figure above
Normal distribution
• Assumes any value on the real line
• It is denoted by Norm(μ, σ²)
• μ is the mean (location parameter) and σ² is the
variance, which controls the spread of the distribution
• Norm(0, 1) is known as the standard normal
distribution
• The pdf is given by
f_X(x) = (1 / (σ·√(2π))) · e^(−(x − μ)² / (2σ²))
[Figure: pdf of Norm(0, 1)]
Need for sampling
• Simulation needs sample realizations of random variables
with prescribed distributions
• Random number generators are used
• The actual sequence produced is a pseudo-random number sequence
• They generate IID numbers uniformly distributed between 0
and 1
• These are then transformed to the desired distribution
• Randomness tests have to be performed
Simple example

• Consider the following random integer


generator
xi ax i 1 c (mod m)
• Xo is the initial seed, a, c and m are
parameters of the generator
• Example: choose an initial seed xo and
generate 10 random integers. Take a=6, c=17
and m=22
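A minimal Python sketch of this generator, run with the example parameters a = 6, c = 17, m = 22; the slide leaves the seed choice open, so x₀ = 3 is an assumed value used only for illustration.

def lcg(seed, a, c, m, count):
    """Linear congruential generator: x_i = (a*x_{i-1} + c) mod m."""
    x = seed
    values = []
    for _ in range(count):
        x = (a * x + c) % m
        values.append(x)
    return values

# Example from the slide: a = 6, c = 17, m = 22; seed chosen arbitrarily as 3
print(lcg(3, 6, 17, 22, 10))
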
Conditions to achieve non repeating series

• i) c and m are relatively prime, i.e. c and m should not
have a common divisor other than 1
• ii) a ≡ 1 (mod g) for every prime factor g of m
– i.e. a = 1 + g·k for some integer k
• iii) a ≡ 1 (mod 4) if m is a multiple of 4
– i.e. a = 1 + 4·k for some integer k
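A small Python sketch that checks the three conditions as stated above for given a, c, m; it only tests the conditions, it is not a full-period proof, and the helper names are chosen here for illustration.

from math import gcd

def prime_factors(n):
    """Return the set of prime factors of n."""
    factors, d = set(), 2
    while d * d <= n:
        while n % d == 0:
            factors.add(d)
            n //= d
        d += 1
    if n > 1:
        factors.add(n)
    return factors

def full_period_conditions(a, c, m):
    """Check the three conditions for a non-repeating LCG sequence."""
    cond1 = gcd(c, m) == 1                                # c and m relatively prime
    cond2 = all(a % g == 1 for g in prime_factors(m))     # a ≡ 1 (mod g) for each prime g dividing m
    cond3 = (m % 4 != 0) or (a % 4 == 1)                  # a ≡ 1 (mod 4) if 4 divides m
    return cond1, cond2, cond3

print(full_period_conditions(6, 17, 22))   # parameters from the earlier example
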
Properties of random numbers

• Two important statistical properties


– 1. uniformity
– 2. independence
Random number generation

• Objective:
– Given a probability distribution, generate random
numbers which conform to it
• Three problems
– Single variate generation – generate X from a
given distribution
– IID process generation – generate an iid sequence of
variates {Xn} with a common distribution
– Non-iid process generation
Random variate generation: inverse transform,
rejection and convolution
• Let X be the random variate to be generated,
having pdf f_X(x) and CDF F_X(x)
• The CDF varies from 0 to 1
• Unif(0, 1) also varies between 0 and 1
• If u = F_X(x), then x = F_X⁻¹(u); so setting X = F_X⁻¹(U)
with U ~ Unif(0, 1) produces a variate with CDF F_X
Procedure to generate random number using
inverse method
• 1) generate a uniform random number u in the
range (0, 1)
• 2) use the generated random number u and the
inverse CDF to find x = F_X⁻¹(u) (a generic sketch follows)
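A generic Python sketch of the two-step procedure, where inv_cdf stands for whatever inverse CDF F_X⁻¹ applies to the target distribution; the function name and the Expo(0.5) illustration (anticipating Example 2 below) are assumptions made for this sketch.

import math
import random

def inverse_transform_sample(inv_cdf, n, seed=0):
    """Step 1: draw u ~ Unif(0,1); Step 2: return x = F^{-1}(u), repeated n times."""
    rng = random.Random(seed)
    return [inv_cdf(rng.random()) for _ in range(n)]

# Illustration: inverse CDF of Expo(lambda = 0.5), i.e. x = -(1/0.5)*ln(1 - u)
samples = inverse_transform_sample(lambda u: -math.log(1 - u) / 0.5, 5)
print(samples)
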
Drawbacks of inverse method
• It may be difficult to obtain the CDF in closed form
• The CDF may not be invertible analytically
Example 1: Generate a uniform random number
• Generate a realization x from a
uniform distribution Unif(2, 10) for the
sample value u = 0.23
u = (x − a) / (b − a)
• Solving for x: x = u·(b − a) + a
x = (10 − 2)·0.23 + 2 = 3.84
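A minimal Python check of Example 1; the helper name is chosen here only for illustration.

def unif_inverse_cdf(u, a, b):
    """Inverse CDF of Unif(a, b): x = u*(b - a) + a."""
    return u * (b - a) + a

print(unif_inverse_cdf(0.23, 2, 10))   # 3.84
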


Example 2: Generation of exponential variates

• Generate three random numbers from an
exponential distribution with mean 2, i.e.
Expo(0.5)
u = 1 − e^(−λx)
x = F_X⁻¹(u) = −(1/λ)·ln(1 − u)
• Step 1. Generate three random numbers using
the linear congruential method
• Step 2. Use the second equation to find x
Example 2 cont…
• Let seed = 5, a = 3, c = 3 and m = 7, then
• u₁ = (3·5 + 3) mod 7 = 4, then 4/7 = 0.57
x₁ = −(1/0.5)·ln(1 − 0.57) = 1.69
• u₂ = (3·4 + 3) mod 7 = 1, then 1/7 = 0.14
x₂ = −(1/0.5)·ln(1 − 0.14) = 0.30
• u₃ = (3·1 + 3) mod 7 = 6, then 6/7 = 0.86
x₃ = −(1/0.5)·ln(1 − 0.86) = 3.93
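The three values of Example 2 can be reproduced with a short Python sketch that combines the linear congruential generator with the exponential inverse CDF. The slide rounds the uniforms to 0.57, 0.14 and 0.86 before taking logs, so the printed x values differ slightly from 1.69, 0.30 and 3.93.

import math

def lcg_uniforms(seed, a, c, m, count):
    """Generate uniform(0,1) numbers u_i = x_i / m with x_i = (a*x_{i-1} + c) mod m."""
    x, us = seed, []
    for _ in range(count):
        x = (a * x + c) % m
        us.append(x / m)
    return us

lam = 0.5                                  # Expo(0.5), i.e. mean 2
for u in lcg_uniforms(seed=5, a=3, c=3, m=7, count=3):
    x = -math.log(1 - u) / lam             # inverse CDF of the exponential distribution
    print(round(u, 2), round(x, 2))        # approximately (0.57, 1.69), (0.14, 0.31), (0.86, 3.89)
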
Normal distribution

• Exercise – generate four random numbers
from a normal distribution
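The slides leave this as an exercise and do not specify a method. One common approach (not described above) is the Box–Muller transform, which turns pairs of Unif(0,1) numbers into pairs of independent Norm(0,1) variates. The Python sketch below is an illustration of that technique, not the method assumed by the slides; the seed is an arbitrary choice.

import math
import random

def box_muller_pairs(n_pairs, seed=0):
    """Generate 2*n_pairs standard normal variates from pairs of uniform(0,1) numbers."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_pairs):
        u1 = 1.0 - rng.random()                 # in (0, 1], avoids log(0)
        u2 = rng.random()
        r = math.sqrt(-2.0 * math.log(u1))      # radius drawn from the first uniform
        out.append(r * math.cos(2 * math.pi * u2))
        out.append(r * math.sin(2 * math.pi * u2))
    return out

print(box_muller_pairs(2))   # four Norm(0, 1) samples
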
Generation of discrete variate

• Follow the same procedure as for the continuous case
• The CDF of a discrete distribution taking the integer values 1, 2, 3, … is
F_X(x) = 0 if x < 1
F_X(x) = Σ_{i=1}^{k} p_i if k ≤ x < k + 1
• The inverse CDF is
x = F_X⁻¹(u) = k for Σ_{n=1}^{k−1} p_n < u ≤ Σ_{n=1}^{k} p_n
Example: generate two random integers corresponding to the
sample values u=0.23 and u=0.45 when the pmf is as shown
below

• [Figure: pmf of the discrete distribution]
• [Figure: the CDF of the distribution, drawn as a step function]
• From the figure, for u = 0.45 we have x = 4, and for u = 0.23,
x = 2
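The pmf used in this example appears only in the figure, so the Python sketch below uses a hypothetical pmf chosen purely for illustration (picked so that it happens to reproduce the x = 2 and x = 4 answers quoted above); the search for the smallest value whose cumulative probability reaches u is the same regardless of the actual pmf.

def discrete_inverse_cdf(u, pmf):
    """Return the smallest value v such that the cumulative probability up to v is >= u."""
    cumulative = 0.0
    for value, p in sorted(pmf.items()):
        cumulative += p
        if u <= cumulative:
            return value
    return max(pmf)   # guard against rounding when u is very close to 1

# Hypothetical pmf (the slide's actual pmf is shown only in the figure)
pmf = {1: 0.1, 2: 0.2, 3: 0.1, 4: 0.3, 5: 0.3}

print(discrete_inverse_cdf(0.23, pmf))   # 2
print(discrete_inverse_cdf(0.45, pmf))   # 4
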
Generating random variables from Poisson
distribution
• Exercise -
Generating random variables from Binomial
distribution
• Exercise
Generation of stochastic process

• Generating a stochastic process means generating a
sequence of variates with a given probability distribution
• IID process generation
– Generation is similar to the single-variate case
– Steps
• 1) determine the seeds for the RNG
• 2) generate uniform random numbers using each seed
• 3) using the CDF of the given distribution, find the inverse
and generate the random sequence elements
• 4) repeat the steps until the required number of variates is obtained
(a sketch is shown below)
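A minimal Python sketch of these steps for an IID Expo(λ) sequence; the LCG parameters are the small illustrative ones from Example 2 and would be replaced by a proper generator in practice.

import math

def lcg_stream(seed, a=3, c=3, m=7):
    """Steps 1-2: seed an LCG and yield uniform(0,1) numbers."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

def iid_exponential_sequence(n, lam, seed=5):
    """Steps 3-4: apply the inverse CDF of Expo(lam) to each uniform, n times."""
    uniforms = lcg_stream(seed)
    return [-math.log(1 - next(uniforms)) / lam for _ in range(n)]

print(iid_exponential_sequence(3, lam=0.5))   # matches Example 2 up to rounding
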
