INDIAN INSTITUTE OF TECHNOLOGY KHARAGPUR
END SEMESTER EXAMINATION
Date: 22-04-2009 Time: 3 hours Full Marks: 50
Spring Semester: 2008-09 Department: Mathematics No. of students: 191
Subject No. MA 20106 Subject Name: Probability and Stochastic Processes
Courses: IInd Year B.Tech. (EC/EE/IE/MT)
INSTRUCTIONS: Answer questions for 50 marks. Marks are indicated at the end of
each question. Mention the question numbers that you have attempted on the cover of the
answer script. Statistical tables may be used.
1. The marks obtained by the students in Mathematics, Physics and Chemistry in an
examination are normally distributed with the means 52, 50 and 48 and with standard
deviations 10, 8 and 6 respectively. Find the probability that a student selected at
random has secured a total of (i) 180 or above and (ii) 135 or less. 2M
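A minimal numerical sketch for question 1, assuming the three subject marks are independent so that the total T is normal with mean 52 + 50 + 48 = 150 and variance 10^2 + 8^2 + 6^2 = 200 (scipy is used only for the normal CDF):

```python
from math import sqrt
from scipy.stats import norm

# Total marks T = Maths + Physics + Chemistry, assumed independent:
# mean = 52 + 50 + 48 = 150, variance = 10^2 + 8^2 + 6^2 = 200
total = norm(loc=150, scale=sqrt(200))

print("P(T >= 180) =", total.sf(180))   # (i) survival function = 1 - CDF
print("P(T <= 135) =", total.cdf(135))  # (ii)
```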
2. The percentage X of a particular compound contained in a rocket fuel follows a
normal distribution with mean 33 and standard deviation 3, though the specification
for X is that it should lie between 30 and 35. The manufacturer will get a net profit
(per unit of fuel) of Rs. 100, if 30 < X < 35, Rs. 50, if 25 c) = 5/16. 5M
6. The percentage scores X and Y in school and college examinations respectively are
seen to follow a bivariate normal distribution with means μ1 = 60, μ2 = 75, standard
deviations σ1 = 5, σ2 = 4, and correlation ρ = 0.8. If a student gets 70% in school, what
is the probability of his/her securing at least 80% in the college? What is the change
in the answer, if ρ = -0.8? 3M
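A sketch for question 6, using the standard conditional distribution of a bivariate normal: given X = 70, Y is normal with mean μ2 + ρ(σ2/σ1)(70 − μ1) and variance σ2²(1 − ρ²).

```python
from math import sqrt
from scipy.stats import norm

mu1, mu2, s1, s2 = 60, 75, 5, 4

def p_at_least_80(rho):
    # Conditional law of Y given X = 70 for a bivariate normal
    cond_mean = mu2 + rho * (s2 / s1) * (70 - mu1)
    cond_sd = s2 * sqrt(1 - rho ** 2)
    return norm(cond_mean, cond_sd).sf(80)

print("rho = +0.8:", p_at_least_80(0.8))   # conditional mean 81.4, sd 2.4
print("rho = -0.8:", p_at_least_80(-0.8))  # conditional mean 68.6, sd 2.4
```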
7. Prove that random variables X and Y are independent if and only if
M_{X,Y}(s, t) = M_X(s) M_Y(t) for all s and t for which both sides exist. 2M
8. Consider throws of a fair die. At stage n, the system is in state E_j, if j is the largest
number appearing in the first n throws. Find the transition probability matrix for this
Markov chain. Classify the recurrent and transient states. 3M
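A short sketch for question 8, building the 6x6 matrix for the running maximum of fair-die throws (from state j, the maximum stays at j with probability j/6 and jumps to any k > j with probability 1/6 each):

```python
import numpy as np

# States 1..6 = current maximum; row j gives the next-throw transition probabilities.
P = np.zeros((6, 6))
for j in range(1, 7):
    P[j - 1, j - 1] = j / 6            # a throw <= j keeps the maximum at j
    for k in range(j + 1, 7):
        P[j - 1, k - 1] = 1 / 6        # a throw of k > j raises the maximum to k

print(P)
# State 6 is absorbing (hence recurrent); states 1..5 are transient.
```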
9. Consider the four state Markov chain with states S = {1, 2, 3, 4} and the transition
probability matrix
P =
    1  0  0  0
    q  0  p  0
    0  q  0  p
    p  0  q  0
(a) If the initial state is 2, find the expected amount of time that the chain is in
(i) state 3, (ii) state 4.
(b) Further starting in state 3, find the probability that the chain ever enters
(i) state 2, (ii) state 4. 4M
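A sketch for question 9 using the mean-time-in-transient-states machinery, assuming the matrix reads as reconstructed above with p + q = 1 (the value p = 0.4 below is only an illustrative choice; the question leaves p symbolic):

```python
import numpy as np

p = 0.4          # illustrative value only
q = 1 - p

# Transition matrix over states 1, 2, 3, 4 (state 1 absorbing)
P = np.array([
    [1, 0, 0, 0],
    [q, 0, p, 0],
    [0, q, 0, p],
    [p, 0, q, 0],
])

transient = [1, 2, 3]                 # indices of states 2, 3, 4
PT = P[np.ix_(transient, transient)]  # restriction to the transient states
S = np.linalg.inv(np.eye(3) - PT)     # S[i, j] = expected visits to j starting from i

# (a) expected time spent in states 3 and 4, starting from state 2
print("E[time in 3 | start 2] =", S[0, 1])
print("E[time in 4 | start 2] =", S[0, 2])

# (b) probability of ever entering j from i (i != j): f_ij = S[i, j] / S[j, j]
print("P(ever enter 2 | start 3) =", S[1, 0] / S[0, 0])
print("P(ever enter 4 | start 3) =", S[1, 2] / S[2, 2])
```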
10. A professor continually gives exams to his students. He can give three possible
types of exams, and his class is graded as either having done well or badly. Let p_i
denote the probability that the class does well on a type i exam, and suppose that p1 =
0.3, p2 = 0.6 and p3 = 0.9. If the class does well on an exam, then the next exam is
equally likely to be any of the three types. If the class does badly, then the next exam
is always type 1. Represent this as a Markov chain and find the long-run proportions of
the exam types. 3M
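A sketch for question 10: the exam type forms a 3-state chain with P(i, 1) = p_i/3 + (1 − p_i) and P(i, 2) = P(i, 3) = p_i/3; the long-run proportions are its stationary distribution, computed numerically below.

```python
import numpy as np

p = {1: 0.3, 2: 0.6, 3: 0.9}   # P(class does well on a type-i exam)

# Exam-type chain: done well -> next type uniform over {1,2,3}; done badly -> next type is 1
P = np.array([[p[i] / 3 + (1 - p[i]) * (1 if j == 1 else 0) for j in (1, 2, 3)]
              for i in (1, 2, 3)])

# Stationary distribution: left eigenvector of P for eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()
print("long-run proportions of exam types 1, 2, 3:", pi)
```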
11. Three tanks target each other. Tank A hits its target with probability 2/3, tank B with
probability 1/2, and tank C with probability 1/3. Shots are fired simultaneously from all
live tanks and each tank fires at its strongest opponent (one with the highest
probability of success). Once a tank is hit, it is dead. Find the state space and
transition probability matrix. Further classify the states. 3M
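A simulation sketch for question 11. The hit probability 1/2 for tank B is an assumption (the figure is unclear in the source), while the targeting rule (fire at the live opponent with the highest hit probability) is taken from the question; the states of the chain are the subsets of live tanks.

```python
import random
from collections import Counter

# Hit probabilities; the value for B is an assumption, see the lead-in above.
HIT = {"A": 2/3, "B": 1/2, "C": 1/3}

def strongest_opponent(shooter, alive):
    """Each live tank fires at the live opponent with the highest hit probability."""
    opponents = [t for t in alive if t != shooter]
    return max(opponents, key=lambda t: HIT[t])

def simulate_round(alive):
    """One simultaneous volley; returns the set of tanks still alive afterwards."""
    hits = set()
    for shooter in alive:
        if len(alive) > 1:
            target = strongest_opponent(shooter, alive)
            if random.random() < HIT[shooter]:
                hits.add(target)
    return alive - hits

# Example: empirical one-step transition probabilities out of the full state {A, B, C}.
counts = Counter(frozenset(simulate_round({"A", "B", "C"})) for _ in range(100_000))
for state, c in counts.most_common():
    print(sorted(state), c / 100_000)
```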
12. Define a Markov chain. State and prove the Chapman-Kolmogorov equations. Define
the relation of communication in a Markov chain. Prove that it is an equivalence
relation. 5M
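A quick numerical illustration of the Chapman-Kolmogorov equations for question 12, P^(n+m) = P^(n) P^(m), on an arbitrary small chain chosen only for demonstration:

```python
import numpy as np

# An arbitrary 3-state transition matrix used only to illustrate the identity
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

n, m = 2, 3
lhs = np.linalg.matrix_power(P, n + m)                             # P^(n+m)
rhs = np.linalg.matrix_power(P, n) @ np.linalg.matrix_power(P, m)  # P^(n) P^(m)
print(np.allclose(lhs, rhs))   # True: P_ij^(n+m) = sum_k P_ik^(n) P_kj^(m)
```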
13. State the assumptions of a Poisson process. Hence derive the distribution of the
number of arrivals in a Poisson process with rate λ. Further derive the distribution of
the waiting time for the first occurrence in a Poisson process. 5M
14. A spider hunting a fly moves between locations 1 and 2 according to a Markov chain
with transition probability matrix
    P1 =  0.7  0.3
          0.3  0.7
starting in location 1. The fly, unaware of the spider, starts in location 2 and moves
according to a Markov chain with transition probability matrix
    P2 =  0.4  0.6
          0.6  0.4
The spider catches the fly and the hunt ends whenever they meet in the same location.
Show that the progress of the hunt, except for knowing the location where it ends, can
be described by a three state Markov chain. Obtain the transition probability matrix
for this chain. Find the probability that at time 2, the spider and the fly are both at
their initial locations. What is the average duration of the hunt? 4M
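A sketch for question 14, assuming the two matrices read as above. Until they meet, the spider and fly occupy opposite locations, so the hunt is described by the three states (spider at 1, fly at 2), (spider at 2, fly at 1) and "caught"; the code builds that chain, the time-2 probability, and the mean hunt duration.

```python
import numpy as np

P1 = np.array([[0.7, 0.3], [0.3, 0.7]])   # spider (assumed reading)
P2 = np.array([[0.4, 0.6], [0.6, 0.4]])   # fly (assumed reading)

# Hunt states: 0 = (spider at 1, fly at 2), 1 = (spider at 2, fly at 1), 2 = caught
def row(s, f):
    """Transition probabilities out of the un-caught state (spider at s, fly at f)."""
    stay   = P1[s - 1, s - 1] * P2[f - 1, f - 1]   # both keep their current locations
    swap   = P1[s - 1, f - 1] * P2[f - 1, s - 1]   # they exchange locations
    caught = 1 - stay - swap                       # they end up in the same location
    return stay, swap, caught

P = np.zeros((3, 3))
P[0, 0], P[0, 1], P[0, 2] = row(1, 2)
P[1, 1], P[1, 0], P[1, 2] = row(2, 1)
P[2, 2] = 1.0                                      # 'caught' is absorbing

# Probability that at time 2 both are back at their initial locations (state 0)
print("P(state 0 at time 2) =", np.linalg.matrix_power(P, 2)[0, 0])

# Mean duration of the hunt = expected time to absorption starting from state 0
Q = P[:2, :2]
m = np.linalg.solve(np.eye(2) - Q, np.ones(2))
print("mean hunt duration =", m[0])
```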
15. Describe a two-dimensional random walk as a Markov chain. Find the equivalence
classes. Prove that all states are recurrent if the walk is symmetric. 3M
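For the recurrence part of question 15, the key estimate (a standard argument, sketched here with Stirling's approximation for the symmetric walk on the plane lattice) is:

```latex
% A return to the origin at time 2n needs k steps right, k left, n-k up, n-k down:
\[
  p_{00}^{(2n)}
  = \frac{1}{4^{2n}} \sum_{k=0}^{n} \frac{(2n)!}{k!\,k!\,(n-k)!\,(n-k)!}
  = \frac{1}{4^{2n}} \binom{2n}{n}^{2}
  \sim \frac{1}{\pi n} \quad (n \to \infty),
\]
% so \(\sum_n p_{00}^{(2n)}\) diverges, the origin is recurrent, and by communication
% every state of the symmetric two-dimensional walk is recurrent.
```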
16. Each morning Madhu leaves her house and goes for a run. She is equally likely to
leave either from the front or the back door. Upon leaving the house, she chooses a
pair of running shoes (or goes running barefoot if there are no shoes at the door from
which she departed). On her return, she is equally likely to enter, and leave her
running shoes, either by the front or the back door. Describe the model as a Markov
chain and find its transition probability matrix, if she owns a total of k pairs of running
shoes. What proportion of the time does she run barefoot? 5M
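A sketch for question 16, modelling the number of pairs at the front door as a chain on {0, 1, ..., k}: a pair moves from one door to the other only when she leaves by a door that has shoes and returns by the other door, each such move having probability 1/4. The barefoot proportion then follows from the stationary distribution (k = 3 is only an illustrative choice).

```python
import numpy as np

k = 3                                    # illustrative number of pairs she owns

# State i = number of pairs at the front door (the back door has k - i)
P = np.zeros((k + 1, k + 1))
for i in range(k + 1):
    if i > 0:
        P[i, i - 1] = 0.25               # leave by the front (with shoes), return by the back
    if i < k:
        P[i, i + 1] = 0.25               # leave by the back (with shoes), return by the front
    P[i, i] = 1 - P[i].sum()             # every other combination leaves the count unchanged

# The chain is doubly stochastic, so the stationary distribution is uniform on the k+1 states
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()

# Barefoot runs: front door empty and she leaves by the front, or back door empty and
# she leaves by the back, each door being chosen with probability 1/2
barefoot = pi[0] * 0.5 + pi[k] * 0.5
print("stationary distribution:", pi)
print("proportion of barefoot runs:", barefoot, "=", 1 / (k + 1))
```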
17. Jobs arrive at a processing center in accordance with a Poisson process with rate
λ. The center has waiting space only for N jobs and so an arriving job finding N others
waiting goes away. At most 1 job per day can be processed, and the processing of this
job starts at the beginning of the day. Thus if there are any jobs waiting for
processing at the beginning of the day, then one of them is processed that day, and if
no jobs are waiting at the beginning of the day, then no jobs are processed that day.
Let X_n denote the number of jobs at the center at the beginning of day n. Find the
transition probability matrix of this Markov chain. Is this chain ergodic? Find the
stationary probabilities if N = 3 and λ = 1. 5M
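A sketch for question 17. A common reading of the dynamics is X_{n+1} = min((X_n − 1)^+ + A_n, N), where A_n is the Poisson(λ) number of arrivals on day n and arrivals in excess of the waiting space are lost; the code builds the resulting transition matrix and its stationary distribution for the requested case N = 3, λ = 1.

```python
import numpy as np
from scipy.stats import poisson

N, lam = 3, 1.0

# P[i, j] = P(X_{n+1} = j | X_n = i), with X_{n+1} = min((i - 1)^+ + A, N), A ~ Poisson(lam)
P = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    base = max(i - 1, 0)                       # jobs left after the morning processing
    for j in range(base, N):
        P[i, j] = poisson.pmf(j - base, lam)   # exactly j - base arrivals that day
    P[i, N] = poisson.sf(N - 1 - base, lam)    # N - base or more arrivals (excess is lost)

# The chain is finite, irreducible and aperiodic, hence ergodic; stationary distribution:
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
print("stationary probabilities:", pi / pi.sum())
```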