
Analysis of Production Systems (4DC10)

Week 3

Michel Reniers

Department of Mechanical Engineering


Recap: Random variables, expectation and variance
• Random variable X assigns a numerical value to each outcome of an experiment
• Discrete random variable X takes discrete values, x1 , x2 , ...
• Function pj = P(X = xj ) is the probability mass distribution
• Expected value (or “centre of probability mass”)

  E[X] = Σ_{j≥1} x_j p_j

• Measure for variability of X around E[X] is the variance

  Var[X] = E[(X − E[X])²] = E[X²] − (E[X])²

• Standard deviation of X is the square root of the variance, σ(X) = √Var[X]
Example: KIVA
Robot Betty has to retrieve a rack in an aisle with N (1 meter spaced) positions
Rack that has to be retrieved is randomly located in the aisle

[Diagram: aisle with positions 1, 2, ..., N, spaced 1 meter apart]

Question: What is the expected distance Betty has to travel into the aisle?
Answer: Let X be location of retrieval, then travel distance is X

  E[X] = Σ_{n=1}^{N} n·P(X = n) = Σ_{n=1}^{N} n·(1/N) = (1/N)·(N(1 + N)/2) = (1 + N)/2 meter
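The expectation above can be checked with a short Monte Carlo sketch (the function name and the choice N = 10 are illustrative, not from the lecture):

```python
import random

def expected_travel(N, trials=100_000, seed=1):
    """Monte Carlo estimate of Betty's mean travel distance in an aisle of N racks."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += rng.randint(1, N)  # rack location is uniform on 1..N
    return total / trials

N = 10
estimate = expected_travel(N)
exact = (1 + N) / 2  # the formula above gives 5.5 meters for N = 10
```

With 100 000 samples the estimate should agree with (1 + N)/2 to well within 0.1 meter.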
Discrete random variables: Bernoulli

Bernoulli random variable X with success probability p


P(X = 1) = p = 1 − P(X = 0)

Then

  E[X] = 0·P(X = 0) + 1·P(X = 1) = p
  E[X²] = 0²·P(X = 0) + 1²·P(X = 1) = p

so

  Var[X] = E[X²] − (E[X])² = p − p² = p(1 − p)
Discrete random variables: Geometric
Geometric random variable X is the number of Bernoulli trials till the first success
Each trial with success probability p
Hence for k = 1, 2, ...
  P(X = k) = (1 − p)^{k−1} p

Then

  E[X] = Σ_{k=1}^{∞} k(1 − p)^{k−1} p = p Σ_{k=1}^{∞} k(1 − p)^{k−1} = p · (1/p²) = 1/p
Justification

  E[X] = Σ_{k=1}^{∞} k(1 − p)^{k−1} p
       = p Σ_{k=1}^{∞} k(1 − p)^{k−1}
       = p Σ_{j=0}^{∞} (j + 1)(1 − p)^j
       = p [ Σ_{j=0}^{∞} j(1 − p)^j + Σ_{j=0}^{∞} (1 − p)^j ]
       = p [ (1 − p)/p² + 1/p ]
       = 1/p
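A quick simulation check of E[X] = 1/p (a minimal sketch; the helper name and p = 0.2 are illustrative):

```python
import random

def geometric_sample(p, rng):
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

rng = random.Random(2)
p = 0.2
n = 200_000
mean = sum(geometric_sample(p, rng) for _ in range(n)) / n  # should be close to 1/p = 5
```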
Discrete random variables: Geometric
[Figure: Geometric distribution with success probability p = 0.9, 0.5, 0.2]
Discrete random variables: Binomial

Binomial random variable X is the number of successes in n independent Bernoulli trials X1, ..., Xn

Each trial with success probability p

Hence for k = 0, 1, ..., n

  P(X = k) = (n choose k) p^k (1 − p)^{n−k}

Since X = X1 + · · · + Xn, we get

  E[X] = E[X1] + · · · + E[Xn] = np
  Var[X] = Var[X1] + · · · + Var[Xn] = np(1 − p)
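The mean and variance formulas can be verified directly against the pmf (a sketch; n = 20 and p = 0.25 match the plot on the next slide but are otherwise arbitrary):

```python
import math

n, p = 20, 0.25

# exact pmf: P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
def binom_pmf(k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

mean = sum(k * binom_pmf(k) for k in range(n + 1))              # = n*p = 5
var = sum(k * k * binom_pmf(k) for k in range(n + 1)) - mean**2  # = n*p*(1-p) = 3.75
```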
Discrete random variables: Binomial
[Figure: Binomial distribution with parameters n = 20 and p = 0.25, 0.5, 0.65]
Discrete random variables: Poisson
Poisson random variable X with parameter λ > 0 expresses the probability of a given
number of events occurring in a fixed interval of time if these events occur with a
known constant mean rate λ and independently of the time since the last event
  P(X = k) = e^{−λ} λ^k / k!,   k = 0, 1, 2, ...

Then

  E[X] = Σ_{k=0}^{∞} k·P(X = k)
       = Σ_{k=1}^{∞} k e^{−λ} λ^k / k!
       = e^{−λ} λ Σ_{j=0}^{∞} λ^j / j!
       = e^{−λ} λ e^{λ} = λ
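The same conclusion can be checked numerically from the pmf (a sketch; λ = 5 is illustrative, and the sum is truncated where the tail is negligible):

```python
import math

lam = 5.0

# pmf built recursively: P(X=0) = e^{-lam}, P(X=k) = P(X=k-1) * lam / k
pmf = [math.exp(-lam)]
for k in range(1, 60):
    pmf.append(pmf[-1] * lam / k)

total = sum(pmf)                              # ≈ 1 (tail beyond k = 59 is negligible)
mean = sum(k * p for k, p in enumerate(pmf))  # ≈ lam
```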
Discrete random variables: Poisson
[Figure: Poisson distribution with parameter λ = 1, 5, 10]
Continuous random variable

A continuous random variable X can take continuous values x ∈ R

Its probability distribution function F(x) is of the form

  F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(y) dy

where the function f(x) satisfies

  f(x) ≥ 0 for all x,   ∫_{−∞}^{∞} f(x) dx = 1

Function f(x) is called the probability density of X and it is the derivative of F(x)

  f(x) = (d/dx) F(x)
Continuous random variable

Probability that X takes a value in the interval (a, b] can be calculated as

  P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a)
              = ∫_{−∞}^{b} f(x) dx − ∫_{−∞}^{a} f(x) dx
              = ∫_{a}^{b} f(x) dx

Probability that X takes a specific value b is equal to 0

  P(X = b) = lim_{a↑b} P(a < X ≤ b) = lim_{a↑b} ∫_{a}^{b} f(x) dx = ∫_{b}^{b} f(x) dx = 0
Expected value and variance

For a continuous random variable X with density f(x), its expected value is defined as

  E[X] = ∫_{−∞}^{∞} x f(x) dx

Variance of X is

  Var[X] = E[(X − E[X])²] = ∫_{−∞}^{∞} (x − E[X])² f(x) dx

Standard deviation of X is the square root of the variance

  σ(X) = √Var[X]
Recipe for calculating densities

First determine F(x) = P(X ≤ x)

Then differentiate!

Example: X is distance to 0 of a random point in a disk of radius r

Question: What is density of X?

Answer:

  F(x) = P(X ≤ x) = πx² / (πr²) = x²/r²,   0 ≤ x ≤ r

so

  f(x) = (d/dx) F(x) = 2x/r²,   0 ≤ x ≤ r
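The density just derived can be sampled by inverse transform and checked against E[X] = ∫₀^r x · 2x/r² dx = 2r/3 (a sketch; names and r = 1 are illustrative):

```python
import math
import random

def disk_distance(r, rng):
    """Distance to the centre of a uniform random point in a disk of radius r,
    sampled by inverse transform: F(x) = x^2 / r^2  =>  X = r * sqrt(U)."""
    return r * math.sqrt(rng.random())

rng = random.Random(3)
r = 1.0
n = 200_000
mean = sum(disk_distance(r, rng) for _ in range(n)) / n  # should approach 2r/3
```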
Continuous random variable: Uniform
Uniform random variable X on [a, b] has density

  f(x) = 1/(b − a) for a < x < b, and f(x) = 0 otherwise

Then

  P(X ≤ x) = ∫_{a}^{x} 1/(b − a) dy = (x − a)/(b − a),   a < x < b

  E[X] = ∫_{a}^{b} x/(b − a) dx = (a + b)/2

  E[X²] = ∫_{a}^{b} x²/(b − a) dx = (b³ − a³)/(3(b − a))

  Var[X] = E[X²] − (E[X])² = (b − a)²/12
Continuous random variable: Exponential

Exponential distribution is used to model the time between events that occur
randomly and independently.

Examples of situations that can be modeled by the exponential distribution are:

• time until an earthquake, a customer, or a phone call occurs


• time until a radioactive particle decays
• distance between mutations on a DNA strand
• height of molecules in a gas at a stable temperature and pressure
• monthly and annual highest values of rainfall and river outflow volumes
Continuous random variable: Exponential

Exponential random variable X with parameter (or rate) λ > 0 has density

  f(x) = λe^{−λx} for x > 0, and f(x) = 0 otherwise

Distribution P(X ≤ x) = 1 − e^{−λx}

Expectation E[X] = 1/λ

Variance Var[X] = 1/λ²

[Figure: exponential densities f(x) on 0 ≤ x ≤ 5 for several values of λ]
Properties of Exponential

Memoryless property: For all t, s > 0

  P(X > t + s | X > s) = P(X > t + s, X > s) / P(X > s) = e^{−λ(t+s)} / e^{−λs} = e^{−λt} = P(X > t)

Minimum: For two independent exponentials X1 and X2 with rates λ1 and λ2

  P(min(X1, X2) ≤ x) = 1 − P(min(X1, X2) > x)
                     = 1 − P(X1 > x, X2 > x)
                     = 1 − P(X1 > x)P(X2 > x)
                     = 1 − e^{−(λ1+λ2)x}

So min(X1, X2) is Exponential with rate λ1 + λ2
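The minimum property can be verified by simulation: the sample mean of min(X1, X2) should approach 1/(λ1 + λ2). A minimal sketch, with illustrative rates λ1 = 2 and λ2 = 3:

```python
import random

rng = random.Random(4)
lam1, lam2 = 2.0, 3.0
n = 200_000

# random.expovariate(lam) draws an Exponential with rate lam (mean 1/lam)
mins = [min(rng.expovariate(lam1), rng.expovariate(lam2)) for _ in range(n)]
mean_min = sum(mins) / n  # should approach 1 / (lam1 + lam2) = 0.2
```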
Example: One-time business decision

Demand during a single period is described by a continuous random variable X with density

  f(x) = µe^{−µx},   x > 0

Suppose you decide to put Q units on stock to meet the demand during a single period

Question: How much to put on stock such that the stock-out probability is no more than 10%?

Answer: Determine Q such that

  P(X > Q) = e^{−µQ} ≤ 1/10

so

  Q ≥ ln(10)/µ
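The answer generalizes to any target stock-out probability α; a minimal sketch (the function name, α default, and µ = 0.01 are illustrative):

```python
import math

def stock_level(mu, alpha=0.10):
    """Smallest Q with stock-out probability P(X > Q) = exp(-mu*Q) <= alpha
    for exponentially distributed demand with rate mu."""
    return math.log(1 / alpha) / mu

mu = 0.01            # mean demand 1/mu = 100 units
Q = stock_level(mu)  # ln(10)/mu, about 230.3 units
```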
Continuous random variable: Normal

Normal random variable X with parameters µ and σ > 0

  f(x) = (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)},   −∞ < x < ∞

Then

  E[X] = µ,   Var[X] = σ²

Density f(x) is denoted as N(µ, σ²) density

Standard normal random variable X has N(0, 1) density

  f(x) = ϕ(x) = (1/√(2π)) e^{−x²/2}

and

  P(X ≤ x) = Φ(x) = (1/√(2π)) ∫_{−∞}^{x} e^{−y²/2} dy
Normal distribution

[Figure: Density of Normal distribution for µ = 0 and σ = 1 (black), 2 (red), 5 (blue)]
Properties of Normal

Linearity: If X is Normal with parameters µ and σ, then aX + b is Normal with parameters aµ + b and |a|σ

Additivity: If X and Y are independent and Normal, then the sum X + Y is also Normal

Question: Suppose X has parameters µX and σX, and Y has parameters µY and σY. What are the mean and variance of X + Y?

Answer: Mean of X + Y is µX + µY and variance is σX² + σY²
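The additivity answer can be checked by simulation (a sketch; the parameter values are illustrative):

```python
import random

rng = random.Random(5)
muX, sigmaX = 1.0, 2.0
muY, sigmaY = 3.0, 1.5
n = 200_000

# random.gauss(mu, sigma) draws a Normal sample with the given mean and std dev
sums = [rng.gauss(muX, sigmaX) + rng.gauss(muY, sigmaY) for _ in range(n)]
mean = sum(sums) / n                          # ≈ muX + muY = 4.0
var = sum((s - mean) ** 2 for s in sums) / n  # ≈ sigmaX^2 + sigmaY^2 = 6.25
```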
Recap: System capacity
• Manufacturing system of workstations W1 , ... , WN with mi parallel and identical
machines in workstation Wi , raw processing time t0i
• Arrival rate to the manufacturing system is λ jobs per unit time, fraction qi of
arrival flow is diverted to Wi . Fraction pij of throughput of workstation Wi diverted
to Wj , fraction pi0 of throughput leaves the system
• Throughput δi of workstation Wi satisfies conservation of flow:

  δi = λqi + Σ_{j=1}^{N} δj pji,   i = 1, ..., N

• Utilization of workstation Wi is ui = δi t0i / mi < 1
• Bottleneck workstation Wb is the one with the highest utilization ub
• Maximal inflow λmax is the rate for which ub becomes equal to 1
Example

[Diagram: jobs arrive at rate λ to each of W1, W2 and W3; W1 sends fraction 0.8 of its output to W3 and scraps fraction 0.2; W3 sends fractions 0.2 and 0.8 of its output to W1 and W2; W2 feeds W4; W4 feeds fraction 0.3 back to W3 and the rest leaves as throughput δ]

Balance equations                Solution

  δ1 = λ + 0.2δ3                 δ1 = 1.7λ
  δ2 = λ + 0.8δ3                 δ2 = 3.8λ
  δ3 = λ + 0.8δ1 + 0.3δ4         δ3 = 3.5λ
  δ4 = δ2                        δ4 = 3.8λ

Balance of flow to the whole system: δ + 0.2δ1 = 3λ

Assume single machine workstations with process times

  t01 = 5.0, t02 = 2.5, t03 = 3.0, t04 = 2.0 hours

Then utilization

  u1 = 8.5λ, u2 = 9.5λ, u3 = 10.5λ, u4 = 7.6λ

Workstation W3 is the bottleneck

Maximal inflow rate is λmax = 1/10.5 = 0.095 jobs per hour

Fraction of jobs scrapped is 0.2δ1/(3λ) = 0.34/3 = 0.113
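The balance equations can be solved numerically, e.g. by fixed-point iteration, reproducing the utilizations and bottleneck above (a sketch; solving per unit arrival rate, λ = 1, with illustrative names):

```python
# Fixed-point (Gauss-Seidel style) iteration on the balance equations, lambda = 1
d1 = d2 = d3 = d4 = 0.0
for _ in range(200):
    d1 = 1 + 0.2 * d3
    d2 = 1 + 0.8 * d3
    d3 = 1 + 0.8 * d1 + 0.3 * d4
    d4 = d2

t0 = {1: 5.0, 2: 2.5, 3: 3.0, 4: 2.0}
util = {i: d * t0[i] for i, d in zip((1, 2, 3, 4), (d1, d2, d3, d4))}
bottleneck = max(util, key=util.get)  # workstation with highest utilization: W3
lambda_max = 1 / util[bottleneck]     # ≈ 0.095 jobs per hour
scrap_fraction = 0.2 * d1 / 3         # ≈ 0.113
```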
Little’s Law

Fundamental relationship among WIP, cycle time and throughput


• w is the average WIP level in the system
• δ is the throughput of the system
• φ is the average flow time in the system

Then Little’s law states that w = δφ


[Diagram: a system with average WIP w; jobs flow in and out at rate δ and spend average flow time φ inside]

Remark: Little holds for a stable system: inflow equals throughput (no job accumulation)
Little’s Law δ = w/φ

[Figure: cumulative counts over time; solid line is total input to the system in (0, t), dashed line is total output in (0, t)]
Little’s Law δ = w/φ

Applied to buffer:
WIP in the buffer of workstation W = Throughput of W × Time spent in buffer

Applied to whole manufacturing system:


WIP in the whole manufacturing system = Throughput of the system × Cycle time

According to Little’s law, the same throughput δ can be achieved with


• large WIP w and long flow times φ
• small WIP w and short flow times φ

Question: What causes the difference? Answer: Variability!


Variability

Controllable variation: Result of (bad) decisions


• Variability in mix of products produced by the plant
• Batch movement of material (first finished part waits longer to move than last one)

Random variation: Result of events beyond our control


• Time elapsing between customer demands
• Machine failures

Variability in manufacturing systems:


• Process time variability
• Flow variability

Variability

First moment intuition (mean):


• Get more products out by speeding up bottleneck machine or adding more
product carriers

Second moment intuition (variance):


• Which is more variable: time to process individual part or a whole batch of those
parts?
• Which results in greater improvement of line performance:
– Reduce variability of process times closer to raw materials (upstream the line)?
– Reduce variability of process times closer to customers (downstream the line)?
• Which are more disruptive:
– Short frequent machine failures?
– Long infrequent machine failures?

Process time variability

Effective process time is total time seen by a job at a station, including


– natural process time
– random failures
– setups
– rework
– operator unavailability
– and other shop floor realities

Standard deviation σ is an absolute measure of variability: σ = 1 hour is big for mean process time t = 10 minutes, but small for t = 10 hours

Coefficient of variation c is a relative measure of variability

  c = standard deviation / mean = σ / t
Process time variability

Sources of variability
• “Natural” variability (differences in operators, machines, material)
• Random outages (failures)
• Setups
• Operator (un)availability
• Rework

Classes of variability
• Low c < 0.75: Process times without outages
• Moderate 0.75 ≤ c < 1.33: Process times with short outages (setups)
• High c ≥ 1.33: Process times with long outages (failures)

Natural variability

Catch-all category due to differences in:


• skills and experience of operators
• machines
• composition of material

Natural coefficient of variation

  c0 = (standard deviation of natural process time) / (mean of natural process time) = σ0 / t0

Natural process times typically have low variability: c0 < 0.75
Preemptive outages

Uncontrollable down-times randomly occurring during processing such as

• machine breakdowns

• power downs

• operators being called away

• running out of raw material

Preemptive outages: Effect on machine capacity
Availability (long-run fraction of time machine is available)

  A = mf / (mf + mr)

where
• mf is mean time to failure
• mr is mean time to repair

Adjusting natural process time t0 to effective process time

  te = t0 / A

and effective capacity of workstation with m machines

  re = m / te = (m / t0) A = A r0
Preemptive outages: Effect on machine capacity – Example

Machine M1 : t0 = 15, σ0 = 3.35, c0 = 0.223, mf = 744, mr = 248


Machine M2 : t0 = 15, σ0 = 3.35, c0 = 0.223, mf = 114, mr = 38

So M1 has infrequent long stops, M2 has frequent short stops

For both machines

  A = 0.75,   te = 20,   re = 1/20

So M1 and M2 have the same effective capacity
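These numbers follow directly from the availability formulas on the previous slide; a minimal sketch (the helper name is illustrative):

```python
def effective(t0, mf, mr, m=1):
    """Availability A, effective process time te, and effective capacity re = m*A/t0."""
    A = mf / (mf + mr)
    te = t0 / A
    re = m / te
    return A, te, re

A1, te1, re1 = effective(15, 744, 248)  # machine M1: infrequent long stops
A2, te2, re2 = effective(15, 114, 38)   # machine M2: frequent short stops
```

Both calls return A = 0.75, te = 20 and re = 1/20, confirming the equal effective capacity.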
Preemptive outages: Effect on process time variability

Assumptions

• Time to failure is Exponential with rate rf, so

  P(failure in (t, t + ∆)) = 1 − e^{−rf ∆} ≈ rf ∆ in every small time interval (t, t + ∆)

so failures are “truly unpredictable”

• After repair, processing resumes at the point where it was interrupted by the
failure

Preemptive outages: Effect on process time variability

Mean and variance of effective process time

  te = t0 / A

  σe² = (σ0 / A)² + (mr² + σr²)(1 − A) t0 / (A mr)

  ce² = σe² / te² = c0² + (1 + cr²) A(1 − A) mr / t0
      = c0² + A(1 − A) mr / t0 + cr² A(1 − A) mr / t0

where σr is the standard deviation of the time to repair and cr = σr / mr is its coefficient of variation
Preemptive outages: Effect on process time variability

Interpretation of squared coefficient of variation of effective process time

  ce² = c0² + A(1 − A) mr / t0 + cr² A(1 − A) mr / t0

• c0² is natural variability
• A(1 − A) mr / t0 is variability due to occurrence of random breakdowns
• cr² A(1 − A) mr / t0 is variability due to the repair times

Note: ce² increases in mr, so long repair times induce more variability than short ones
Preemptive outages: Effect on process time variability – Example

Machine M1 : t0 = 15, σ0 = 3.35, c0 = 0.223, mf = 744, mr = 248, cr = 1


Machine M2 : t0 = 15, σ0 = 3.35, c0 = 0.223, mf = 114, mr = 38, cr = 1

For machine M1
ce2 = 6.25
and for machine M2
ce2 = 1

So machine M1 exhibits much more variability than M2

Conclusion: From the viewpoint of variability, it is better to have short frequent stops
than long infrequent ones!
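Plugging the two machines into the ce² formula above reproduces these values (a sketch; the function name is illustrative):

```python
def ce_squared(t0, c0, mf, mr, cr):
    """Squared coefficient of variation of effective process time
    with exponential failures: c0^2 + (1 + cr^2) * A * (1 - A) * mr / t0."""
    A = mf / (mf + mr)
    return c0**2 + (1 + cr**2) * A * (1 - A) * mr / t0

ce2_M1 = ce_squared(15, 0.223, 744, 248, 1)  # long infrequent stops, ≈ 6.25
ce2_M2 = ce_squared(15, 0.223, 114, 38, 1)   # short frequent stops, ≈ 1.0
```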
Nonpreemptive outages

Controllable down-times (not occurring during processing) such as

• tool changes

• setups

• preventive maintenance

• shift changes

Nonpreemptive outages: Effect on machine capacity – Example
Machine needs setup with mean ts and coefficient of variation cs after having produced on average Ns jobs (with probability 1/Ns a setup is performed after processing a job)

Availability

  A = Ns t0 / (Ns t0 + ts)

Effective process time

  te = t0 / A = t0 + ts / Ns

Effective capacity of workstation with m machines

  re = m / te = (m / t0) A = A r0

Variance of the effective process time

  σe² = σ0² + σs²/Ns + ((Ns − 1)/Ns²) ts²
Nonpreemptive outages: Effect on process time variability – Example

Machine M1 is flexible, no setups: t0 = 1.2 hours, c0 = 0.5


Machine M2 is fast, with setups: t0 = 1, c0 = 0.25, Ns = 10, ts = 2, cs = 0.25

Then te = 1.2 for both machines M1 and M2 , so they have the same effective capacity

Question: Which machine has less variability?

Answer: For M1

  ce² = c0² = 0.25

and for M2

  σe² = σ0² + σs²/Ns + ((Ns − 1)/Ns²) ts² = 0.0625 + 0.025 + 0.36 = 0.4475
  ce² = σe²/te² = 0.4475/1.44 ≈ 0.31

So flexible machine M1 exhibits less variability than M2!
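As a numerical check of the setup formulas on the previous slide (a sketch; the function and variable names are illustrative):

```python
def setup_effective(t0, c0, Ns, ts, cs):
    """Effective mean te and squared SCV ce^2 for a machine that needs a setup
    (mean ts, coefficient of variation cs) after Ns jobs on average."""
    sigma0 = c0 * t0
    sigma_s = cs * ts
    te = t0 + ts / Ns
    var_e = sigma0**2 + sigma_s**2 / Ns + (Ns - 1) / Ns**2 * ts**2
    return te, var_e / te**2

te1, ce2_M1 = 1.2, 0.5**2  # flexible machine M1, no setups: ce^2 = c0^2
te2, ce2_M2 = setup_effective(1.0, 0.25, 10, 2.0, 0.25)  # fast machine M2 with setups
```

Both machines come out with te = 1.2, while the flexible machine has the smaller ce².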
