
Homework 2

MTH 212M/ MTH 412A (2024)


Applied Stochastic Process - I

1. Suppose {Xn ; n ≥ 1} is a sequence of i.i.d. Bernoulli random variables such that

P(X1 = 0) = p and P(X1 = 1) = 1 − p; 0 < p < 1.

Suppose Yn = min{M, X1 + . . . + Xn} for n = 1, 2, . . ., where M is a fixed positive integer.
Show that {Yn ; n ≥ 1} is a Markov Chain, and find the transition probability matrix P.
2. An urn contains B black balls and R red balls at the beginning. A ball is drawn at
random and is replaced by a ball of the opposite color. If Xn denotes the number of
black balls after the n-th draw, show that {Xn ; n ≥ 1} is a Markov Chain. Find the
transition probability matrix P.
3. Suppose A and B are two players playing a game, with initial fortunes of Rs. 5 and
Rs. 10, respectively. At each step they toss a fair coin. If a head appears, A wins and
gets Rs. 1 from B; otherwise A has to give Rs. 1 to B. The game stops as soon as
either A or B has all the money. Let Xn denote the amount of money A has after the
n-th toss. Show that {Xn ; n ≥ 1} is a Markov Chain. Find the transition probability
matrix. Based on computer simulation, (a) find the probability that A has all the money
when the game stops, and (b) find the expected duration of the game.
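
For parts (a) and (b), a minimal Monte Carlo sketch in Python is given below. It assumes the setup above (A starts with Rs. 5, the total capital is Rs. 15, and the coin is fair); the function and parameter names are illustrative, and the results are simulation estimates rather than exact solutions of the chain.

```python
import random

def simulate_game(a_start=5, total=15, p_head=0.5, rng=random):
    """Play one game; return (did A win?, number of tosses played)."""
    fortune, tosses = a_start, 0
    while 0 < fortune < total:
        fortune += 1 if rng.random() < p_head else -1   # head: A gains Rs. 1
        tosses += 1
    return fortune == total, tosses

def estimate(n_games=100_000, seed=0):
    """Estimate P(A wins) and the expected game duration by simulation."""
    rng = random.Random(seed)
    wins = total_tosses = 0
    for _ in range(n_games):
        a_wins, tosses = simulate_game(rng=rng)
        wins += a_wins
        total_tosses += tosses
    return wins / n_games, total_tosses / n_games

p_a_wins, mean_duration = estimate()
print(f"Estimated P(A ends with all the money) = {p_a_wins:.4f}")
print(f"Estimated expected duration = {mean_duration:.1f} tosses")
```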
4. Customers arrive for service and take their place in a waiting line. During each period
of time a single customer is served, provided that at least one customer is present. If
no customer awaits service, then during this period no service is performed. During
a service period new customers may arrive. It is assumed that the number of arrivals
in the n-th period is a random variable Zn, whose distribution is independent of the
period and is given by

P(Zn = k) = pk; k = 0, 1, 2, . . . ,

where 0 ≤ pk ≤ 1 and ∑_{k=0}^{∞} pk = 1. It is further assumed that Z1, Z2, . . . are
independently distributed. If Xn denotes the number of customers waiting in the line
for service, show that {Xn ; n ≥ 1} is a Markov Chain. Find the transition probability
matrix P.
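
As an illustration of the dynamics described in this problem, here is a minimal simulation sketch. The particular arrival distribution pk used below is an assumption made only to have concrete numbers; the problem itself leaves pk general.

```python
import random

# Illustrative arrival distribution: P(Zn = k) = p[k] for k = 0, 1, 2, 3
# (an assumption for this sketch only).
p = [0.3, 0.4, 0.2, 0.1]

def simulate_queue(n_periods=20, seed=1):
    """Track the number of waiting customers over n_periods:
    in each period one customer is served if anyone is present,
    and Zn new customers arrive with distribution p."""
    rng = random.Random(seed)
    x = 0                  # customers waiting at the start
    history = [x]
    for _ in range(n_periods):
        z = rng.choices(range(len(p)), weights=p)[0]   # arrivals this period
        x = max(x - 1, 0) + z                          # one customer served if x > 0
        history.append(x)
    return history

print(simulate_queue())
```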
5. Suppose a coin is tossed indefinitely, and P(H) = p, where 0 < p < 1. If a head appears,
we call it a success. We define a random variable Xn after the n-th toss as follows: Xn = k
if there is a run of k successes since the last failure, where k = 0, 1, 2, . . .. Show that
{Xn ; n ≥ 1} is a Markov Chain, and find its transition probability matrix P.
6. Suppose an organism at the end of its lifetime produces a random number Y of offspring
with probability distribution

P(Y = k) = pk; k = 0, 1, 2, . . . , where pk ≥ 0 and ∑_{k=0}^{∞} pk = 1.

It is assumed that all offspring act independently of each other and, at the end of
their lifetime (for simplicity, the life spans of all organisms are assumed to be the same),
individually have progeny in accordance with the above probability distribution, thus
propagating their species. Let Xn denote the population size at the n-th generation.
Show that {Xn ; n ≥ 1} is a Markov Chain. If Y follows a Binomial(10, 1/2) distribution,
find the transition probability matrix P.
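
As an illustration, a minimal simulation sketch of this branching process with the Binomial(10, 1/2) offspring distribution mentioned in the problem; the function names, the initial population size, and the number of generations are illustrative assumptions.

```python
import random

def offspring(rng, n_trials=10, p=0.5):
    """One individual's offspring count, drawn from Binomial(10, 1/2)."""
    return sum(rng.random() < p for _ in range(n_trials))

def simulate_population(x0=1, generations=6, seed=3):
    """Return the population sizes X0, X1, ..., X_generations."""
    rng = random.Random(seed)
    sizes = [x0]
    for _ in range(generations):
        # Each individual in the current generation reproduces independently.
        sizes.append(sum(offspring(rng) for _ in range(sizes[-1])))
    return sizes

print(simulate_population())
```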

7. Show that if P is a stochastic matrix, then P^2 is also a stochastic matrix. In fact,
P^m is also a stochastic matrix for any positive integer m.
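
As a quick numerical sanity check (not a proof), one can verify the row sums for powers of an arbitrarily chosen stochastic matrix; the matrix below is an illustrative assumption.

```python
import numpy as np

# An arbitrary 3x3 stochastic matrix: nonnegative entries, rows summing to 1.
P = np.array([[0.2, 0.5, 0.3],
              [0.6, 0.1, 0.3],
              [0.0, 0.4, 0.6]])

for m in range(1, 6):
    Pm = np.linalg.matrix_power(P, m)
    # Entries stay nonnegative and each row still sums to 1.
    assert np.all(Pm >= 0) and np.allclose(Pm.sum(axis=1), 1.0)
    print(f"P^{m} row sums:", Pm.sum(axis=1))
```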

8. If Q is a stochastic matrix, is it always possible to find a stochastic matrix P such that
P^2 = Q?
