Chapter 1.3 Markov Chain

Stochastic model and Markov chain
Recap
• Brief review of the deterministic model
• Unique nature of the deterministic model
• Expectations from the stochastic model

Introduction to Stochastic Model
• The word "stochastic" derives from a Greek word meaning "to aim, to guess," and means "random" or "chance." Its antonym is "sure," "deterministic," or "certain."
• A deterministic model predicts a single outcome from
a given set of circumstances.
• A stochastic model predicts a set of possible outcomes
weighted by their likelihoods, or probabilities.
• A coin flipped into the air will surely return to earth
somewhere. Whether it lands heads or tails is random.
• However, phenomena are not in and of themselves inherently stochastic or deterministic. Rather, modeling a phenomenon as stochastic or deterministic is the choice of the observer.
• The choice depends on the observer's purpose; the criterion for judging the choice is usefulness.
Stochastic Processes
• In probability theory, a stochastic process (or random process) is a collection of (indexed) random variables (r.v.). These collections of r.v. are frequently used to represent the evolution of a random quantity (X) over time (t).
• A stochastic process is the random analogue of a deterministic process: even if the initial condition is known, there are several (often infinitely many) directions in which the process may evolve.
• Stochastic processes are distinguished by their state space (the range of possible values for the random variables Xt), by their index set T, and by the dependence relations among the random variables Xt.
• A stochastic process is a family of random variables Xt, where t is a parameter running over a suitable index set T. (Where convenient, we will write X(t) instead of Xt.)
• In a common situation, the index t corresponds to discrete units of time, and the index set is T = {0, 1, 2, …}.
• In this case, Xt might represent the outcomes of successive tosses of a coin, repeated responses of a subject in a learning experiment, or successive observations of some characteristic of a certain population.
Relevance of Stochastic Processes
• The practical relevance of stochastic processes can be shown by a brief list of some of the important areas in which they arise:
• 1. Economics — we frequently deal with daily stock market
quotations or monthly unemployment figures;
• 2. Social sciences — population birth rates and school enrollments
series have been followed for many centuries in several countries;
• 3. Epidemiology — numbers of influenza cases are often
monitored over long periods of time;
• 4. Medicine — blood pressure measurements are traced over
time to evaluate the impact of pharmaceutical drugs used in
treating hypertension
Probability Spaces and Random Variables
• Several parameters of a problem can be considered uncertain and are thus represented as random variables.
– Production and distribution costs typically depend on fuel costs, which are random.
– Future demands depend on uncertain market conditions.
– Crop returns depend on uncertain weather conditions.
The Poisson distribution is one of three discrete distributions (Binomial, Poisson, and Hypergeometric) that use integers as random variables.
The Poisson Distribution
• Siméon-Denis Poisson (1781–1840) was a mathematician known for his work on definite integrals, electrical theory, and probability.
• In 1837, Poisson derived his distribution to approximate the Binomial distribution when the probability of occurrence (p) is small.
• A use for this distribution was not found until 1898, when Bortkiewicz was tasked by the Prussian Army with investigating accidental deaths of soldiers attributed to horse kicks.
The Poisson Distribution
• The initial application of the Poisson distribution, determining the number of deaths attributed to horse kicks in the Prussian Army, led to its use in analyzing accidental deaths, service requirements, errors over time, and in reliability engineering.
• The Poisson distribution is usually applicable in situations where random "events" occur at a certain rate over a period of time.
The Poisson Distribution
Consider the following scenarios:
• The hourly number of customers arriving at a bank
• The daily number of accidents on Adama express
highway
• The hourly number of accesses to a particular web server
• The daily number of emergency calls in Addis
• The monthly number of employees who had an absence
in a large company
• Monthly demands for a particular product
• All of these are situations where the Poisson distribution
may be applicable.
The Poisson Distribution
• The Poisson distribution arises when a set of canonical assumptions is reasonably valid. These are:
– The number of events that occur in any time interval is independent
of the number of events in any other disjoint interval. Here, “time
interval” is the standard example of an “exposure variable” and other
interpretations are possible. Example: Error rate per page in a book.
– The distribution of number of events in an interval is the same for all
intervals of the same size.
– For a “small” time interval, the probability of observing an event is
proportional to the length of the interval. The proportionality
constant corresponds to the “rate” at which events occur.
– The probability of observing two or more events in an interval
approaches zero as the interval becomes smaller.
The Poisson equation for predicting the probability of a specific number of defects or failures (r) in time (t) is:

P(r) = (λt)^r e^(-λt) / r!                              (1)

To calculate the probability of k or fewer failures occurring in time t, the probability of each failure count must be summed, as shown in Equation (2):

P(r ≤ k) = Σ (r = 0 to k) (λt)^r e^(-λt) / r!           (2)
Example on Poisson process
• For example, assume that a population of components has a failure rate of 121.7 failures per one million hours. A component is expected to operate for 43,800 hours, and only 2 failures are expected to occur.
A. Calculate the probability of exactly 0, 1, and 2 failures.
B. Find the probability of two or fewer failures.
C. Calculate the confidence level for the failure rate above.
Solution
A.1 The probability of zero (0) failures:
P(0) = e^(-λt) ≈ 0.0048, where λt = (121.7 × 10^-6 failures/hour)(43,800 hours) ≈ 5.33
A.2 The probability of one (1) failure:
P(1) = (λt) e^(-λt) ≈ 0.0256
A.3 The probability of two (2) failures:
P(2) = ((λt)² / 2) e^(-λt) ≈ 0.0682

B. Using Equation (2), the probability of two or fewer failures is the sum of these probabilities. That is:

P(r ≤ 2) = 0.0048 + 0.0256 + 0.0682 = 0.0986

C. Using Equation (3),

CL = 1 - P(r ≤ k)                                       (3)

we can determine that the confidence level is approximately 90% that the failure rate of the population is 121.7 failures per one million hours:

CL = 1 - P(r ≤ 2) = 1 - 0.0986 = 0.9014, or 90.14%
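The calculation above can be reproduced with a short script (a sketch using only the standard library; the small differences from the slide's figures come from carrying full precision instead of rounding at each step):

```python
import math

rate = 121.7e-6          # failures per hour (121.7 per one million hours)
hours = 43_800           # expected operating time
m = rate * hours         # expected number of failures, lambda*t (about 5.33)

def poisson_pmf(r: int, m: float) -> float:
    """Equation (1): P(exactly r failures) = (m**r) * e**(-m) / r!"""
    return m**r * math.exp(-m) / math.factorial(r)

p = [poisson_pmf(r, m) for r in range(3)]      # P(0), P(1), P(2)
p_two_or_fewer = sum(p)                        # Equation (2)
confidence = 1 - p_two_or_fewer                # Equation (3)

print([round(x, 4) for x in p])                # ~ [0.0048, 0.0258, 0.0688]
print(round(confidence, 4))                    # ~ 0.90 confidence level
```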
Poisson application - Sparing
• Another application of the Poisson distribution is determining the number of spare line-replaceable units that should be initially available to ensure a preselected probability that a spare is available.
• Frederic J. O'Neal of Bell Laboratories developed such a sparing equation in the 1970s for electronic systems.
• In a 1989 RAMS article, Al Myrick modified this equation to assure availability for a desired confidence level.
Chapter-1.3
Markov Analysis (MA)

Contents:
• Characteristics of Markov analysis
• Application of Markov analysis
• State and transitions probabilities
In a Markov chain, the future depends only upon the present:
NOT upon the past.
Introduction
• The text-book image of a Markov chain has a flea
hopping about at random on the vertices of the
transition diagram, according to the probabilities
shown. The transition diagram below shows a system
with 7 possible states:
• state space S = { 1,2,3,4,5,6,7}
Introduction
• Questions of interest
• Starting from state 1, what is the probability of ever
reaching state 7?
• Starting from state 2, what is the expected time
taken to reach state 4?
• Starting from state 2, what is the long-run proportion of time spent in state 3?
• Starting from state 1, what is the probability of being
in state 2 at time t ? Does the probability converge
as t→∞, and if so, to what?
Introduction
• A Markov chain is a mathematical model of a
random phenomenon evolving with time in a way
that the past affects the future only through the
present.
• The “time” can be discrete (i.e. the integers),
continuous (i.e. the real numbers), or, more
generally, a totally ordered set.
• In Mathematics, a phenomenon which evolves with
time in a way that only the present affects the
future is called a dynamical system.
Markov chains and Markov processes
• Important classes of stochastic processes are
Markov chains and Markov processes.
• A Markov chain is a discrete-time process for which
the future behavior, given the past and the present,
only depends on the present and not on the past.
• A Markov process is the continuous-time version of
a Markov chain.
• Many queuing models are in fact Markov processes.
Markov chain, characteristics
 It is a particular class of probabilistic models known as stochastic processes, in which the current state of a system may depend on all of its previous states.
 In a Markov process, the current state of a system depends only on its immediately preceding state.
 The Markov chain was developed by the Russian mathematician Andrey Markov in 1905.
 Probability of mutually dependent events
 Concept of chained events
Characteristics of a Markov Chain
 There is a finite number of possible states.
 The transition probabilities depend only on the current state of the system.
 The long-run probability of being in a particular state will be constant over time.
 The transition probabilities of moving to alternative states in the next time period, given a state in the current time period, must sum to 1.0.
Markov chain…cont’d
 Consider a discrete-time, finite-state (S) Markov chain {Xt, t = 1, 2, 3, …} with stationary transition probabilities:

P[Xt+1 = j | Xt = i] = pij,  i, j ϵ S.

 Let P = (pij) denote the matrix of transition probabilities.
 The transition probabilities between Xt and Xt+n are denoted p(n)ij, and the n-step transition matrix is P(n) = P^n.
States and Transition Probabilities
• Predicting future states involves knowing the system’s likelihood, or probability, of changing from one state to another.
 These probabilities can be collected and placed in a matrix.
 Such a matrix is called the matrix of transition probabilities (transition matrix):

P = [pij]m×m, with rows indexed by the initial state and columns by the succeeding state S1, S2, …, Sm,

where Σj pij = 1 for every row i, and 0 ≤ pij ≤ 1.
Application of Markov Analysis
Production: helpful in evaluating alternative maintenance policies, certain classes of inventory and queuing problems, and inspection analysis.
Marketing: useful in analyzing and predicting customers’ buying behavior in terms of loyalty to a particular product brand, switching patterns to other brands, and the market share of the company versus its competitors.
Personnel: determining future requirements of an organization, taking into consideration retirements, deaths, resignations, etc.
Finance: customer accounts-receivable behavior.
Example: Markov chain in weather prediction
• Design a Markov chain to predict the weather of tomorrow using information from the past days.
• Our model has only 3 states: S = {S1, S2, S3}, where S1 = Sunny, S2 = Rainy, and S3 = Cloudy.
• To establish the transition probability relationships between states, we will need to collect data.
• Assume the data produces the following transition probabilities:
Example: Markov chain in weather prediction
• Let’s say we have a sequence: Sunny, Rainy, Cloudy, Cloudy, Sunny, Sunny, Sunny, Rainy, …; so, on any day we can be in any of the three states.
• We can use the following state-sequence notation: q1, q2, q3, q4, q5, …, where qi ϵ {Sunny, Rainy, Cloudy}.
• In order to compute the probability of tomorrow’s weather, we can use the Markov property:

P(q2, q3 | q1) = P(q2 | q1) P(q3 | q1, q2) = P(q2 | q1) P(q3 | q2)

• Exercise 1: Given that today is Sunny, what is the probability that tomorrow is Sunny and the next day Rainy?
Markov Analysis - Brand switching problem
• Consider the brand-switching problem in gasoline service stations; the probability of customers changing service station over time is given in the table below.

Table 1: Probability of customer movement per month

                    Next month
This month      Petroco   National
Petroco           .60       .40
National          .20       .80
Customer brand-switching problem…cont’d
 The table above fulfills the Markov characteristics (assumptions):
 The probabilities of moving from a state to all others sum to one.
 The probabilities apply to all system participants.
 The events are independent.

• Suppose the service stations want to know the probability that a customer trades with them in the future (say month 3), given that the customer trades with them this month (month 1).
Solution

Fig. 1: Transition diagram for the brand-switching problem. Petroco retains a customer with probability .60 and loses to National with .40; National retains with .80 and loses to Petroco with .20.
Fig. 2: Tree diagram, if a customer trades with Petroco in month 1

Month 1            Month 2              Month 3        Joint probability
Petroco  --.60-->  Petroco  --.60-->  Petroco                .36
                            --.40-->  National               .24
         --.40-->  National --.20-->  Petroco                .08
                            --.80-->  National               .32
                                                       Sum = 1.00
 The probability that a customer purchases gasoline from Petroco in month 3 is
.36 + .08 = .44
 The probability of a customer trading with National in month 3 is
.24 + .32 = .56
Fig. 3: Tree diagram, if a customer trades with National in month 1

Month 1            Month 2              Month 3        Joint probability
National --.20-->  Petroco  --.60-->  Petroco                .12
                            --.40-->  National               .08
         --.80-->  National --.20-->  Petroco                .16
                            --.80-->  National               .64
                                                       Sum = 1.00
 The probability that a customer purchases gasoline from Petroco in month 3 is
.12 + .16 = .28
 The probability of a customer trading with National in month 3 is
.08 + .64 = .72
Solution… cont’d

Table 2: Probability of customer movement in month 3

                 Probability of trade in month 3
Starting state       Petroco      National
Petroco                .44          .56
National               .28          .72
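The month-3 probabilities in Table 2 can be checked by enumerating every two-step path of the tree diagrams (a minimal sketch; state names follow Table 1):

```python
# Monthly switching probabilities from Table 1 (row = this month, key = next month).
P = {
    "Petroco":  {"Petroco": 0.60, "National": 0.40},
    "National": {"Petroco": 0.20, "National": 0.80},
}

def month3_distribution(start: str) -> dict:
    """Sum the joint probability of every month-1 -> month-2 -> month-3 path."""
    probs = {"Petroco": 0.0, "National": 0.0}
    for mid, p1 in P[start].items():        # month-2 branch
        for end, p2 in P[mid].items():      # month-3 branch
            probs[end] += p1 * p2           # joint probability of this path
    return probs

print(month3_distribution("Petroco"))   # Petroco: .36+.08=.44, National: .24+.32=.56
print(month3_distribution("National"))  # Petroco: .12+.16=.28, National: .08+.64=.72
```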
• Let there be only three manufacturers (A, B, and C) producing macaroni in Hawassa. During the previous month, A sold a total of 120 tonnes, B sold 203 tonnes, and C sold 377 tonnes. It is known that customers do not always purchase from the same manufacturer, for different reasons. The following table gives information regarding the movement of customers; it is further assumed that no new customers enter the market and no old customers leave it.

                 Previous month
New month       A      B      C     Total
A              85      8      7      100
B              20    160     20      200
C              15     35    350      400
Total         120    203    377      700

The manufacturers of the three factories want to know:
a. Should the advertising campaign of manufacturer C be directed towards attracting previous purchasers of manufacturer A or B, or should it concentrate on retaining a large proportion of the previous purchasers of macaroni from C?
Multi-Period Transition Probabilities and the Transition Matrix
 One of the purposes of Markov analysis is to predict the future.
 The elements of the n-step transition matrix P(n) = [p(n)ij]m×m are obtained by repeatedly multiplying the transition matrix P by itself:

P(n) = P(n-1) × P

• Let V(n) represent the matrix of state probabilities at period n (for example, V1 = matrix of state probabilities at period n = 1); then V(n+1) = V(n) P.
• Each row i of V(n) represents the state probability distribution after n transitions, given that the process starts out in state i.
Transition matrix probability… cont’d
• Using the transition matrix, solve the customer brand-switching problem for month 3.

                    Next month
This month      Petroco   National
Petroco           .60       .40
National          .20       .80
Brand switching problem…cont’d
Let
P = the transition matrix, and
Vi = the state probability matrix after i = n periods, n = 1, 2, 3, ….

V1 (month 1):
 If a customer initially trades with Petroco, p(1)11 = 1 and p(1)12 = 0;
 If a customer initially trades with National, p(1)21 = 0 and p(1)22 = 1.
Brand switching problem…cont’d

V2 (month 2) = V1 P:

  | 1  0 |   | .60  .40 |   | .60  .40 |
  | 0  1 | × | .20  .80 | = | .20  .80 |

V3 (month 3) = V2 P:

  | .60  .40 |   | .60  .40 |   | .44  .56 |
  | .20  .80 | × | .20  .80 | = | .28  .72 |

Similarly, we can determine the probabilities of the customer brand-switching problem for months 4, 5, 6, 7, 8, 9, ….
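The matrix computations above can be reproduced in plain Python by applying V(n+1) = V(n) P repeatedly, starting from the identity matrix V1 (a sketch):

```python
P = [[0.60, 0.40],
     [0.20, 0.80]]

def step(V, P):
    """One transition period: V(n+1) = V(n) * P."""
    n = len(P)
    return [[sum(V[i][k] * P[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

V = [[1.0, 0.0],
     [0.0, 1.0]]                 # V1: each customer starts in a known state
for month in range(2, 5):        # months 2, 3 and 4
    V = step(V, P)
    print(month, [[round(x, 3) for x in row] for row in V])
```

Month 2 prints P itself and month 3 prints the V3 matrix above; the loop extends the same computation to later months.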
Steady- state probability
• The steady-state probabilities are average probabilities that
the system will be in a certain state after a large number of
transition periods.
• This does not mean the system stays in one state. The system
will continue to move from state to state in future time
periods; however, the average probabilities of moving from
state to state for all periods will remain constant in the long
run.
• In a Markov process, after a number of periods have passed,
the probabilities will approach steady state.
Steady-state probability
• Steady-state probabilities are average, constant probabilities that the system will be in a state in the future.
• Vi+1 = Vi P, and once steady state is reached, Vi+1 = Vi.

Example: for the gasoline service-station probability matrix, determine the steady-state probabilities.

Solution
To determine the steady-state probabilities for period i+1, we normally set up the following equations.
Steady-state probability …cont’d
• From our previous discussion, Vi+1 = Vi P.

For the first row of the matrix:
p(i+1)11 = .60 p(i)11 + .20 p(i)12
p(i+1)12 = .40 p(i)11 + .80 p(i)12

Once steady state is reached, p(i+1)11 = p(i)11 and p(i+1)12 = p(i)12, so:
p11 = .6 p11 + .2 p12 ……………… (1)
p12 = .4 p11 + .8 p12 ……………… (2)
p11 + p12 = 1.0 ……………… (3)
p11 = 1.0 - p12 ……………… (4)

Substituting (eq. 4) into (eq. 2) gives p11 = .33 and p12 = .67.

For the second row of the matrix:
p(i+1)21 = .60 p(i)21 + .20 p(i)22
p(i+1)22 = .40 p(i)21 + .80 p(i)22
Solving in the same manner as row 1, the values for row 2 are:
p21 = .33, p22 = .67
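The substitution and the claim that repeated transitions settle at these values can both be checked quickly (a sketch; the starting vector is arbitrary):

```python
# Closed form: substituting p12 = 1 - p11 (equation 4) into equation (1)
# gives p11 = .6*p11 + .2*(1 - p11), i.e. .6*p11 = .2.
p11 = 0.2 / 0.6
p12 = 1 - p11
print(round(p11, 2), round(p12, 2))       # 0.33 0.67

# Check by iteration: start from any state distribution and keep transitioning.
v = (1.0, 0.0)                            # customer starts with Petroco
for _ in range(50):
    v = (0.6 * v[0] + 0.2 * v[1],
         0.4 * v[0] + 0.8 * v[1])
print(round(v[0], 2), round(v[1], 2))     # converges to the same 0.33 0.67
```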
• Example 3: Inventory model
• A camera store stocks a particular model camera that can be
ordered weekly. Let D1, D2, … represent the demand for this
camera (the number of units that would be sold if the inventory
is not depleted) during the first week, second week, …,
respectively. It is assumed that the Di’s are independent and
identically distributed random variables having a Poisson
distribution with a mean of 1. Let X0 represent the number of
cameras on hand at the outset, X1 the number of cameras on
hand at the end of week 1, X2 the number of cameras on hand
at the end of week 2, and so on.
– Assume that X0 = 3.
– On Saturday night the store places an order that is delivered
in time for the next opening of the store on Monday.
– The store uses the following order policy: if there are no cameras in stock, 3 cameras are ordered; otherwise, no order is placed.
– Sales are lost when demand exceeds the inventory on hand.
• Draw the transition diagram and show the transition matrix.
Inventory model continued
• A random variable X satisfies the Poisson Distribution if
• 1. The mean number of occurrences of the event, m, over a
fixed interval of time or space is a constant. This is called
the average characteristic. (This implies that the number of
occurrences of the event over an interval is proportional to
the size of the interval.)
• 2. The occurrence of the event over any interval is
independent of what happens in any other non-
overlapping interval.
If X follows a Poisson distribution with mean m, the probability of x events occurring is given by the formula:

p(x) = e^(-m) m^x / x!,  x = 0, 1, 2, …
Inventory model Continued
• Xt is the number of Cameras in stock at the end
of week t (as defined earlier), where Xt
represents the state of the system at time t
• Given that Xt = i, Xt+1 depends only on Dt+1 and Xt
(Markovian property)
• A Poisson random variable can take an infinite
number of values. Since the sum of the
probabilities of all the outcomes is 1 and if, for
example, you require the probability of 2 or
more events, you may obtain this from the
identity P(X ≥ 2)= 1- p(0) – p(1)
Inventory model continued
• Dt has a Poisson distribution with mean equal to one. This means that P(Dt+1 = n) = e^(-1) 1^n / n! for n = 0, 1, …
• P(Dt = 0) = e^(-1) = 0.368
• P(Dt = 1) = e^(-1) = 0.368
• P(Dt = 2) = (1/2) e^(-1) = 0.184
• P(Dt ≥ 3) = 1 - P(Dt ≤ 2) = 1 - (.368 + .368 + .184) = 0.080
• Xt+1 = max(3 - Dt+1, 0) if Xt = 0, and Xt+1 = max(Xt - Dt+1, 0) if Xt ≥ 1, for t = 0, 1, 2, ….
• For the first row of P, we are dealing with a transition from state Xt = 0 to some state Xt+1.
• As indicated above, Xt+1 = max{3 - Dt+1, 0} if Xt = 0.
• Therefore, for the transitions to Xt+1 = 3, Xt+1 = 2, or Xt+1 = 1:
p03 = P{Dt+1 = 0} = 0.368,
p02 = P{Dt+1 = 1} = 0.368,
p01 = P{Dt+1 = 2} = 0.184.
• A transition from Xt = 0 to Xt+1 = 0 implies that the demand for cameras in week t+1 is 3 or more after 3 cameras are added to the depleted inventory at the beginning of the week, so
p00 = P{Dt+1 ≥ 3} = 0.080.
• For the other rows of P, the formula at the end of Sec. 16.1 for the next state is Xt+1 = max{Xt - Dt+1, 0} if Xt ≥ 1. This implies that Xt+1 ≤ Xt, so p12 = 0, p13 = 0, and p23 = 0. The remaining transitions are obtained from the same Poisson demand probabilities.
Inventory Example: (One-Step) Transition Matrix

p03 = P(Dt+1 = 0) = 0.368
p02 = P(Dt+1 = 1) = 0.368
p01 = P(Dt+1 = 2) = 0.184
p00 = P(Dt+1 ≥ 3) = 0.080

States 0, 1, 2, 3 (cameras on hand at the end of a week):

         0     1     2     3
   0   p00   p01   p02   p03
   1   p10   p11   p12   p13
   2   p20   p21   p22   p23
   3   p30   p31   p32   p33
Inventory Example: (One-Step) Transition Matrix

         0      1      2      3
   0   .080   .184   .368   .368
   1   .632   .368     0      0
   2   .264   .368   .368     0
   3   .080   .184   .368   .368
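The matrix above can be generated directly from the Poisson demand probabilities and the order policy (a sketch; `stock` is the inventory available after any Monday delivery):

```python
import math

def pois(n: int, m: float = 1.0) -> float:
    """Poisson pmf: P(D = n) with mean demand m = 1."""
    return m**n * math.exp(-m) / math.factorial(n)

def p(i: int, j: int) -> float:
    """One-step probability of going from i cameras on hand to j."""
    stock = 3 if i == 0 else i                     # order up to 3 when stock is 0
    if j > stock:
        return 0.0                                 # inventory cannot increase in a week
    if j > 0:
        return pois(stock - j)                     # demand is exactly stock - j
    return 1 - sum(pois(d) for d in range(stock))  # demand >= stock empties the shelf

P = [[round(p(i, j), 3) for j in range(4)] for i in range(4)]
for row in P:
    print(row)     # rows reproduce the one-step matrix above
```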
Representation of a Markov Chain as a Digraph

        A     B     C     D
  A   0.95    0   0.05    0
  B   0.2   0.5     0   0.3
  C     0   0.2     0   0.8
  D     0     0     1     0

Each directed edge A→B is associated with the positive transition probability from A to B.
Example 1: The raw material is inspected, and 99.8% of the pieces will be accepted. 97% of the pieces operated on in machine 1 will be accepted. From machines 2 and 3, 95% and 98% will be accepted, respectively.
a. Draw the transition diagram
b. Show the state transition probability matrix
c. Determine the amount of raw material that must be purchased in order to produce 100 good products

Acceptance probabilities along the line: 0.998 (inspection), 0.97 (machine 1), 0.95 (machine 2), 0.98 (machine 3); the complementary probabilities 0.002, 0.03, 0.05, and 0.02 lead to scrap.
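For part (c), a serial line with independent acceptance at each stage passes a part with probability equal to the product of the stage yields, so the raw-material quantity is 100 divided by that product (a sketch of the arithmetic, assuming no rework path):

```python
import math

yields = [0.998, 0.97, 0.95, 0.98]   # inspection, machine 1, machine 2, machine 3

overall = 1.0
for y in yields:
    overall *= y                     # probability a piece survives every stage

raw_needed = 100 / overall           # raw pieces per 100 good products
print(round(overall, 4), math.ceil(raw_needed))   # ~0.9013, so order 111 pieces
```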
Ex 2: Suppose that an order has been received for 100 machined parts. The sequence of manufacturing steps for each part is:
1. Machine 1
2. Inspection 1
3. Machine 2
4. Inspection 2
5. Machine 3
6. Inspection 3
7. Pack & Ship
The reject and rework rates were given in the following table.
a) Develop a Markovian model for the system
b) Find the transition matrix
• The transition diagram of the Markov chain:
To model the process as a Markov chain, we first define the states:
State Description
1 Machine 1
2 Inspection 1
3 Machine 2
4 Inspection 2
5 Machine 3
6 Inspection 3
7 Pack and Ship (absorbing)
8 Scrap Bin (absorbing)
Inventory Example: Two-Step Transition Matrix

P =
         0      1      2      3
   0   .080   .184   .368   .368
   1   .632   .368     0      0
   2   .264   .368   .368     0
   3   .080   .184   .368   .368

P(2) = P·P =
         0      1      2      3
   0   .249   .286   .300   .165
   1   .283   .252   .233   .233
   2   .351   .319   .233   .097
   3   .249   .286   .300   .165
Transition Matrix: Four-Step and Eight-Step
• P(4) = P(2)P(2),  P(8) = P(4)P(4)

P(4) =
         0      1      2      3
   0   .289   .286   .261   .164
   1   .282   .285   .268   .166
   2   .284   .283   .263   .171
   3   .289   .286   .261   .164

P(8) =
         0      1      2      3
   0   .286   .285   .264   .166
   1   .286   .285   .264   .166
   2   .286   .285   .264   .166
   3   .286   .285   .264   .166
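The successive squarings can be verified numerically; every row of P^8 converges to essentially the same distribution (a plain-Python sketch using the rounded one-step matrix):

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[0.080, 0.184, 0.368, 0.368],
     [0.632, 0.368, 0.000, 0.000],
     [0.264, 0.368, 0.368, 0.000],
     [0.080, 0.184, 0.368, 0.368]]

P2 = matmul(P, P)          # two-step matrix
P4 = matmul(P2, P2)        # four-step
P8 = matmul(P4, P4)        # eight-step: rows are nearly identical
for row in P8:
    print([round(x, 3) for x in row])
```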
Steady-State Probabilities
• The steady-state probabilities πj uniquely satisfy the following steady-state equations:

πj = Σ (i = 0 to s) πi pij,  for j = 0, 1, 2, …, s
Σ (j = 0 to s) πj = 1

• π0 = π0 p00 + π1 p10 + π2 p20 + π3 p30
• π1 = π0 p01 + π1 p11 + π2 p21 + π3 p31
• π2 = π0 p02 + π1 p12 + π2 p22 + π3 p32
• π3 = π0 p03 + π1 p13 + π2 p23 + π3 p33
• 1 = π0 + π1 + π2 + π3
Steady-State Probabilities: Inventory Example

         0      1      2      3
   0   .080   .184   .368   .368
   1   .632   .368     0      0
   2   .264   .368   .368     0
   3   .080   .184   .368   .368

• π0 = .080π0 + .632π1 + .264π2 + .080π3
• π1 = .184π0 + .368π1 + .368π2 + .184π3
• π2 = .368π0 + .368π2 + .368π3
• π3 = .368π0 + .368π3
• 1 = π0 + π1 + π2 + π3
• Solving: π0 = .286, π1 = .285, π2 = .263, π3 = .166
• The numbers in each row of the matrix P(8) match the corresponding steady-state probabilities.
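Instead of solving the five equations by hand, the steady-state vector can be found by power iteration: start from any distribution and apply π ← πP until it stops changing (a sketch):

```python
P = [[0.080, 0.184, 0.368, 0.368],
     [0.632, 0.368, 0.000, 0.000],
     [0.264, 0.368, 0.368, 0.000],
     [0.080, 0.184, 0.368, 0.368]]

pi = [0.25, 0.25, 0.25, 0.25]         # any starting distribution works
for _ in range(200):                  # apply pi <- pi * P repeatedly
    pi = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]

print([round(x, 3) for x in pi])      # ~ [0.286, 0.285, 0.263, 0.166]
```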
Special cases in Markov chains
1. Transient state: a state is said to be transient if it is not possible to move to that state from any other state (except itself).

T =

 State 3 is a transient state. Once the system leaves state 3, it will never return to it.
 Both states 1 and 2 contain a 0.0 probability of going to state 3.
 The system will move out of state 3 to state 1 (with a 1.0 probability) but will never return to state 3.
Special cases in Markov chains
2. Cycling processes: a cycling (or periodic) Markov chain is one in which the transition matrix T contains all zero elements on the main diagonal, and all other elements are either 1 or 0.

T1 =        T2 =

• There can be no steady-state conditions for such Markov chains.
Special cases…cont’d
3. Absorbing state: a state is said to be absorbing (trapping) if, once entered, the system does not leave it. This situation occurs when a transition probability on the main diagonal (from upper left to lower right) equals 1.

T =

 State 3 in this transition matrix is referred to as an absorbing or trapping state. Once state 3 is achieved, there is a 1.0 probability that it will be occupied in all succeeding time periods.
Problem
3. A manufacturing firm has developed a transition matrix containing the probabilities that a particular machine will operate or break down in the following week, given its operating condition in the present week.

                     Next week
This week       Operate   Break down
Operate           0.4        0.6
Break down        0.8        0.2

a) Assuming that the machine is operating in week 1, determine the probabilities that the machine will operate or break down in weeks 1, 2, 3, 4, 5, and 6.
b) Determine the steady-state probabilities for this transition matrix and indicate the percentage of future weeks in which the machine will break down.
Problem…cont’d
4. Given the following serial manufacturing system:

Incoming material → Turning → Milling → Drilling → Boring → Finished products
(with scrap possible at every stage)

Table: Summary of its operating characteristics

                 Incoming    Turning      Milling      Drilling     Boring
                 material
Operation rate     n/a      25 parts/hr  25 parts/hr  25 parts/hr  25 parts/hr
Scrap rate        .025%      2.5%         4%           1.5%         2.3%
Problem 4…cont’d
For the given information:
a) Develop a Markovian model for the system
b) Find the transition matrix
c) Find the probability that incoming material becomes finished product
d) Find the probability that incoming material becomes scrap
