
MATHEMATICS FOR COMPUTER SCIENCE (BCS301) 2023

MODULE-2
Joint probability distribution: Joint Probability distribution for two discrete random variables, expectation,
covariance and correlation. Markov Chain: Introduction to Stochastic Process, Probability Vectors, Stochastic
matrices, Regular stochastic matrices, Markov chains, higher transition probabilities, Stationary distribution of
Regular Markov chains and absorbing states.

Joint probability distribution: 𝑃(𝑥, 𝑦) is called the joint probability function for two discrete
random variables 𝑋 and 𝑌 if i) 𝑃(𝑥, 𝑦) ≥ 0 for all 𝑥, 𝑦 and ii) ∑𝑥 ∑𝑦 𝑃(𝑥, 𝑦) = 1.
Then [(𝑥, 𝑦), 𝑃(𝑥, 𝑦)] is called the joint probability distribution.

Marginal distribution of 𝑋 : [𝑥, 𝑓(𝑥)] where 𝑓(𝑥) = ∑𝑦 𝑃(𝑥, 𝑦).

Marginal distribution of 𝑌 : [𝑦, 𝑔(𝑦)] where 𝑔(𝑦) = ∑𝑥 𝑃(𝑥, 𝑦).

𝐸(𝑋) = 𝜇𝑋 = ∑ 𝑥𝑓(𝑥) , 𝐸(𝑌) = 𝜇𝑌 = ∑ 𝑦𝑔(𝑦).

𝑉(𝑋) = ∑ 𝑥²𝑓(𝑥) − (𝜇𝑋 )² , 𝑉(𝑌) = ∑ 𝑦²𝑔(𝑦) − (𝜇𝑌 )².

𝜎𝑋 = √𝑉(𝑋) , 𝜎𝑌 = √𝑉(𝑌).

𝐸(𝑋𝑌) = ∑𝑥 ∑𝑦 𝑥𝑦𝑃(𝑥, 𝑦)

𝐶𝑜𝑣 (𝑋, 𝑌) = 𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌)


Correlation: 𝜌(𝑋, 𝑌) = 𝐶𝑜𝑣 (𝑋, 𝑌) / (𝜎𝑋 𝜎𝑌).

If 𝑃(𝑥, 𝑦) = 𝑓(𝑥)𝑔(𝑦) for all 𝑥, 𝑦 then 𝑋 and 𝑌 are independent.
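
The formulas above translate directly into a short computation over a joint probability table. The following is a minimal sketch (not part of the original notes; the dictionary representation and the helper name cov_corr are our own), storing 𝑃(𝑥, 𝑦) as a Python dictionary keyed by the pair (x, y).

```python
from math import sqrt

def cov_corr(joint):
    """joint: dict mapping (x, y) -> P(x, y). Returns (Cov(X, Y), rho(X, Y))."""
    f, g = {}, {}                      # marginal distributions of X and Y
    for (x, y), p in joint.items():
        f[x] = f.get(x, 0) + p
        g[y] = g.get(y, 0) + p
    EX = sum(x * p for x, p in f.items())          # E(X)
    EY = sum(y * p for y, p in g.items())          # E(Y)
    VX = sum(x * x * p for x, p in f.items()) - EX ** 2   # V(X)
    VY = sum(y * y * p for y, p in g.items()) - EY ** 2   # V(Y)
    EXY = sum(x * y * p for (x, y), p in joint.items())   # E(XY)
    cov = EXY - EX * EY
    return cov, cov / (sqrt(VX) * sqrt(VY))
```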

1. A fair coin is tossed 3 times. Let 𝑋 denote 0 or 1 according as a head or tail occurs on the
first toss. Let 𝑌 denote the number of heads which occur. Find the joint distribution and
marginal distribution of 𝑋 and 𝑌 . Also find 𝐶𝑜𝑣 (𝑋, 𝑌).
Solution:
S HHH HHT HTH HTT THH THT TTH TTT
𝑥 0 0 0 0 1 1 1 1
𝑦 3 2 2 1 2 1 1 0

Joint distribution is

 X \ Y    0      1      2      3     Sum
   0      0     1/8    2/8    1/8    1/2
   1     1/8    2/8    1/8     0     1/2
  Sum    1/8    3/8    3/8    1/8


Marginal distribution of 𝑋:
 𝑥       0     1
 𝑓(𝑥)   1/2   1/2

Marginal distribution of 𝑌:
 𝑦       0     1     2     3
 𝑔(𝑦)   1/8   3/8   3/8   1/8

𝐸(𝑋𝑌) = ∑𝑥 ∑𝑦 𝑥𝑦𝑃(𝑥, 𝑦) = 2/8 + 2/8 = 1/2,
𝐸(𝑋) = ∑ 𝑥𝑓(𝑥) = 1/2,
𝐸(𝑌) = ∑ 𝑦𝑔(𝑦) = 3/8 + 6/8 + 3/8 = 3/2,
𝐶𝑜𝑣 (𝑋, 𝑌) = 𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌) = 1/2 − 3/4 = −1/4.
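
As a quick check of Problem 1 (a sketch, assuming the cov_corr helper defined in the earlier sketch), the joint table can be entered with exact fractions so that the result −1/4 is reproduced exactly:

```python
from fractions import Fraction as F

# Joint distribution of Problem 1: P(X = x, Y = y); rows X = 0, 1, columns Y = 0, 1, 2, 3.
joint1 = {(0, 1): F(1, 8), (0, 2): F(2, 8), (0, 3): F(1, 8),
          (1, 0): F(1, 8), (1, 1): F(2, 8), (1, 2): F(1, 8)}
cov, _ = cov_corr(joint1)
print(cov)   # -1/4, matching Cov(X, Y) above
```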

2. The joint probability distribution for two random variables 𝑋 and 𝑌 is as follows:

 Y \ X   −2     −1      4      5
   1     0.1    0.2     0     0.3
   2     0.2    0.1    0.1     0


Determine i) marginal distribution of 𝑋 and 𝑌 ii) 𝐶𝑜𝑣 (𝑋, 𝑌) iii) Correlation of 𝑋 and 𝑌

Solution: Appending the row and column totals to the given table:

 Y \ X   −2     −1      4      5     Sum
   1     0.1    0.2     0     0.3    0.6
   2     0.2    0.1    0.1     0     0.4
  Sum    0.3    0.3    0.1    0.3

Marginal distribution of 𝑋:
 𝑥      −2     −1      4      5
 𝑓(𝑥)   0.3    0.3    0.1    0.3

Marginal distribution of 𝑌:
 𝑦       1      2
 𝑔(𝑦)   0.6    0.4

𝐸(𝑋) = 𝜇𝑋 = ∑ 𝑥𝑓(𝑥) = 1, 𝐸(𝑌) = 𝜇𝑌 = ∑ 𝑦𝑔(𝑦) = 1.4.


𝑉(𝑋) = ∑ 𝑥²𝑓(𝑥) − (𝜇𝑋 )² = 9.6, 𝑉(𝑌) = ∑ 𝑦²𝑔(𝑦) − (𝜇𝑌 )² = 0.24.


𝜎𝑋 = √𝑉(𝑋) = 3.098 , 𝜎𝑌 = √𝑉(𝑌) = 0.4899

𝐸(𝑋𝑌) = ∑𝑥 ∑𝑦 𝑥𝑦𝑃(𝑥, 𝑦) = 0.9

𝐶𝑜𝑣 (𝑋, 𝑌) = 𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌) = −0.5


Correlation: 𝜌(𝑋, 𝑌) = 𝐶𝑜𝑣 (𝑋, 𝑌) / (𝜎𝑋 𝜎𝑌) = −0.3294.
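
The numbers of Problem 2 can be reproduced with the cov_corr helper sketched earlier (an assumption, not part of the notes):

```python
# X takes values -2, -1, 4, 5 and Y takes values 1, 2; probabilities from the table above.
joint2 = {(-2, 1): 0.1, (-1, 1): 0.2, (4, 1): 0.0, (5, 1): 0.3,
          (-2, 2): 0.2, (-1, 2): 0.1, (4, 2): 0.1, (5, 2): 0.0}
cov, rho = cov_corr(joint2)
print(round(cov, 4), round(rho, 4))   # -0.5 -0.3294
```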

3. The distributions of two independent random variables 𝑋 and 𝑌 are as follows:


 𝑥       0     1     2          𝑦       1     2
 𝑓(𝑥)   0.3   0.3   0.4         𝑔(𝑦)   0.3   0.7

Determine the joint distribution and verify that 𝐶𝑜𝑣 (𝑋, 𝑌) = 0.

Solution: Since 𝑋 and 𝑌 are independent, 𝑃(𝑥, 𝑦) = 𝑓(𝑥)𝑔(𝑦) for all 𝑥, 𝑦 .

The joint distribution is

 Y \ X    0      1      2     Sum
   1     0.09   0.09   0.12   0.3
   2     0.21   0.21   0.28   0.7
  Sum    0.3    0.3    0.4

𝐸(𝑋) = 𝜇𝑋 = ∑ 𝑥𝑓(𝑥) = 1.1, 𝐸(𝑌) = 𝜇𝑌 = ∑ 𝑦𝑔(𝑦) = 1.7.


𝐸(𝑋𝑌) = ∑𝑥 ∑𝑦 𝑥𝑦𝑃(𝑥, 𝑦) = 1.87
𝐶𝑜𝑣 (𝑋, 𝑌) = 𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌) = 1.87 − 1.1 × 1.7 = 0 .
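For Problem 3 the joint table is just the product of the marginals; a sketch (again assuming the cov_corr helper above) confirms that the covariance vanishes for independent variables:

```python
# Marginals of Problem 3; the joint table is built as P(x, y) = f(x) g(y).
fx = {0: 0.3, 1: 0.3, 2: 0.4}
gy = {1: 0.3, 2: 0.7}
joint3 = {(x, y): px * py for x, px in fx.items() for y, py in gy.items()}
cov, _ = cov_corr(joint3)
print(abs(cov) < 1e-12)   # True: Cov(X, Y) = 0 for independent X and Y
```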
Exercise:
1. If 𝑋 and 𝑌 are two independent random variables, 𝑋 takes the values 1 and 2 with probabilities 0.7, 0.3
and 𝑌 takes the values −2, 5, 8 with probabilities 0.3, 0.5, 0.2 respectively. Find the joint probability
distribution of 𝑋 and 𝑌 and 𝐶𝑜𝑣 (𝑋, 𝑌).
2. The joint distribution of two discrete random variables X and Y is given by 𝑃(𝑥, 𝑦) = 𝑘(2𝑥 + 𝑦) ,
where 𝑥 and 𝑦 are integers such that 0 ≤ 𝑥 ≤ 2, 0 ≤ 𝑦 ≤ 3.
Determine i) 𝑘, ii) the marginal distributions of 𝑋 and 𝑌, and iii) whether 𝑋 and 𝑌 are independent.
3. Two pens are selected at random from a box that contains 3 blue pens, 2 red pens and 3 green pens. If X is
the number of blue pens selected, and Y is the number of red pens selected,
then find i) Joint probability distribution ii) 𝑃(𝑥 + 𝑦 ≤ 1) .


Stochastic process: The process {𝑡, 𝑥𝑡 } is called a stochastic process, where 𝑡 is the parameter
(usually time) and the values 𝑥𝑡 of the random variables are called states. A stochastic process is
classified by whether its state space and its parameter are discrete or continuous:
1. Discrete state discrete parameter process.
Example: Number of telephone calls in different days in a telephone booth.
2. Discrete state continuous parameter process.
Example: Number of telephone calls in different time intervals in a telephone booth.
3. Continuous state discrete parameter process.
Example: Average duration of telephone calls in different days in a telephone booth.
4. Continuous state continuous parameter process.
Example: Average duration of telephone calls in different time intervals in a telephone booth.

Probability vector: A vector 𝑣 = (𝑣1 , 𝑣2 , 𝑣3 , ⋯ , 𝑣𝑛 ) is said to be a probability vector if

i) 𝑣𝑖 ≥ 0 for all 𝑖 and ii) 𝑣1 + 𝑣2 + ⋯ + 𝑣𝑛 = 1.

Examples: (0.6, 0.4), (0.1, 0.3, 0.4, 0.2), (1, 0, 0), (1/2, 0, 1/2).

Stochastic matrix: A square matrix 𝑃 is said to be a stochastic matrix if each row of 𝑃 is a
probability vector.

Examples:

 [ 0.3   0.2   0.5 ]        [ 0.3   0.7 ]
 [  0     1     0  ]   ,    [  1     0  ]
 [ 0.2   0.2   0.6 ]
Regular stochastic matrix: A stochastic matrix 𝑃 is said to be regular if all the entries of
𝑃^𝑘 are nonzero for some positive integer 𝑘.
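
A direct way to test regularity is to raise 𝑃 to successive powers and look for a power whose entries are all positive; the sketch below is our own helper (the name is_regular and the cut-off max_power are assumptions), using NumPy.

```python
import numpy as np

def is_regular(P, max_power=50):
    """Return True if some power P^k (k <= max_power) has all entries > 0."""
    P = np.asarray(P, dtype=float)
    Q = P.copy()                       # Q holds P^k, starting at k = 1
    for _ in range(max_power):
        if np.all(Q > 0):              # entries of a stochastic matrix are >= 0,
            return True                # so "nonzero" means "strictly positive"
        Q = Q @ P
    return False

# The first example matrix above is NOT regular: its second row (0, 1, 0) is absorbing.
print(is_regular([[0.3, 0.2, 0.5], [0, 1, 0], [0.2, 0.2, 0.6]]))   # False
```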

Fixed point (unique fixed probability vector): Let 𝑃 be a regular stochastic matrix. If 𝑣 is a
probability vector such that 𝑣𝑃 = 𝑣, then 𝑣 is called the unique fixed probability vector of 𝑃.

Markov chain: A Markov chain is a discrete state, discrete parameter process in which the state space
is finite and the probability of being in any state depends at most on the immediately preceding state.
The transition probability matrix of a Markov chain is 𝑃 = 〈𝑝𝑖𝑗 〉,
where 𝑝𝑖𝑗 is the probability of a transition from the 𝑖th state to the 𝑗th state.

Transient state of a Markov chain: A state 𝑎𝑖 is said to be transient if the chain may be in state
𝑎𝑖 at some step, but after a finite number of steps the chain never returns to state 𝑎𝑖 .
Example: When a computer program is modelled as a Markov chain by treating every step as a
state, every step except the last is a transient state.


Absorbing state: A state 𝑎𝑖 is said to be absorbing if, once the chain reaches state 𝑎𝑖 , it
remains in that state.
Example: Consider the transition probability matrix of a Markov chain with state space
(𝑎1 , 𝑎2 , 𝑎3 ):

 𝑃 = [ 0.2   0.5   0.3 ]
     [  0     1     0  ]
     [ 0.5   0.5    0  ]

Clearly the state 𝑎2 is absorbing.
Recurrent state: A state 𝑎𝑖 is said to be recurrent if the chain, starting from state 𝑎𝑖 , returns
to 𝑎𝑖 in a finite number of steps with probability 1. The minimum number of steps required to
return is called the period of the state.

Higher transition probabilities: Let 𝑃 be the transition probability matrix of a Markov chain, and
let 𝑣0 be the initial probability vector of the chain.
Then after the first step the probability vector is 𝑣1 = 𝑣0 𝑃 ,
after the second step the probability vector is 𝑣2 = 𝑣1 𝑃 = 𝑣0 𝑃² ,
after the third step the probability vector is 𝑣3 = 𝑣2 𝑃 = 𝑣0 𝑃³ , and so on.
A Markov chain whose transition probability matrix is regular is irreducible (every state can be
reached from every other state).
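
These successive distributions are simply repeated vector–matrix products, 𝑣𝑛 = 𝑣0 𝑃ⁿ. A minimal sketch follows (the function name distribution_after and the use of NumPy are our own assumptions, not from the notes).

```python
import numpy as np

def distribution_after(v0, P, n):
    """Probability vector after n steps of a Markov chain: v_n = v_0 P^n."""
    v = np.asarray(v0, dtype=float)
    P = np.asarray(P, dtype=float)
    for _ in range(n):
        v = v @ P          # one step: v_{k+1} = v_k P
    return v
```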
Problems:
1. Show that

 𝑃 = [  0     1     0  ]
     [ 1/6   1/2   1/3 ]
     [  0    2/3   1/3 ]

is a regular stochastic matrix and find the associated unique fixed probability vector.

Solution: Since

 𝑃² = [ 1/6    1/2    1/3  ]
      [ 1/12  23/36   5/18 ]
      [ 1/9    5/9    1/3  ]

and all the entries of 𝑃² are nonzero, 𝑃 is regular.

Let 𝑣 = (𝑥, 𝑦, 𝑧) be the fixed probability vector; then 𝑣𝑃 = 𝑣, that is,

 (𝑥, 𝑦, 𝑧) [  0     1     0  ] = (𝑥, 𝑦, 𝑧)
           [ 1/6   1/2   1/3 ]
           [  0    2/3   1/3 ]

⟹  𝑥 + 𝑦 + 𝑧 = 1,   𝑦/6 = 𝑥,   𝑥 + 𝑦/2 + 2𝑧/3 = 𝑦
⟹  𝑥 + 𝑦 + 𝑧 = 1,   −𝑥 + 𝑦/6 = 0,   𝑥 − 𝑦/2 + 2𝑧/3 = 0
⟹  𝑥 = 0.1,  𝑦 = 0.6,  𝑧 = 0.3.

∴ 𝑣 = (0.1, 0.6, 0.3) .
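
The same fixed vector can be found numerically by solving 𝑣𝑃 = 𝑣 together with the normalisation 𝑥 + 𝑦 + 𝑧 = 1; the sketch below (our own, using NumPy's least-squares solver) reproduces (0.1, 0.6, 0.3).

```python
import numpy as np

P = np.array([[0, 1, 0],
              [1/6, 1/2, 1/3],
              [0, 2/3, 1/3]])
n = P.shape[0]
# vP = v  <=>  (P - I)^T v^T = 0; add the row of ones to force sum(v) = 1.
A = np.vstack([(P - np.eye(n)).T, np.ones((1, n))])
b = np.array([0.0] * n + [1.0])
v, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(v, 3))   # [0.1 0.6 0.3]
```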



2. Three boys A, B, C are throwing a ball to each other. A always throws the ball to B, B always
throws the ball to C, but C is just as likely to throw the ball to B as to A. If C was the first
person to throw the ball, find the probability that after three throws i) A has the ball,
ii) B has the ball, iii) C has the ball.
Solution: Let a, b, c denote the states in which the ball is with A, B or C respectively.
The transition probability matrix is

          a     b     c
     a [  0     1     0  ]
 𝑃 = b [  0     0     1  ]
     c [ 1/2   1/2    0  ]
Since C was the first person to throw the ball, 𝑣0 = (0, 0, 1)
Then after the first throw the probability vector is 𝑣1 = 𝑣0 𝑃 = (1⁄2 , 1⁄2 , 0) ,
after the second throw the probability vector is 𝑣2 = 𝑣1 𝑃 = (0, 1⁄2 , 1⁄2) ,
after the third throw the probability vector is 𝑣3 = 𝑣2 𝑃 = (1⁄4 , 1⁄4 , 1⁄2).
Therefore, after the third throw, the probabilities that A, B and C have the ball are
1⁄4 , 1⁄4 and 1⁄2 respectively.
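
A sketch of the step-by-step computation for Problem 2, assuming NumPy (not part of the notes):

```python
import numpy as np

P = np.array([[0, 1, 0],
              [0, 0, 1],
              [0.5, 0.5, 0]])    # rows/columns in the order a, b, c
v = np.array([0, 0, 1.0])        # C throws first
for k in range(1, 4):
    v = v @ P
    print(k, v)                  # after throw 3: [0.25 0.25 0.5]
```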
3. Show that

 𝐴 = [  0     1     0  ]
     [  0     0     1  ]
     [ 1/2   1/2    0  ]

is regular.

Solution:

 𝐴² = [  0     0     1  ]        𝐴³ = [ 0.5   0.5    0  ]
      [ 0.5   0.5    0  ]             [  0    0.5   0.5 ]
      [  0    0.5   0.5 ]             [ 0.25  0.25  0.5 ]

 𝐴⁴ = [  0     0.5   0.5  ]      𝐴⁵ = [ 0.25   0.25   0.5  ]
      [ 0.25  0.25   0.5  ]           [ 0.25   0.5    0.25 ]
      [ 0.25  0.5    0.25 ]           [ 0.125  0.375  0.5  ]

Since all the entries of 𝐴⁵ are nonzero, 𝐴 is regular.
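
The same conclusion follows from a short NumPy check of 𝐴⁵ (a sketch, not from the notes):

```python
import numpy as np

A = np.array([[0, 1, 0], [0, 0, 1], [0.5, 0.5, 0]])
A5 = np.linalg.matrix_power(A, 5)
print(A5)                 # matches the matrix A^5 computed above
print(np.all(A5 > 0))     # True, so A is regular
```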
4. A student’s study habits are as follows. If he studies one night, he is 70% sure not to study
the next night. On the other hand if he does not study one night, he is 60% sure not to study
the next night. In the long run, how often does he study?
Solution: Let S denote the state "he studies" and N the state "he does not study".
The transition probability matrix is

          S     N
 𝑃 = S [ 0.3   0.7 ]
     N [ 0.4   0.6 ]

Let 𝑣 = (𝑥, 𝑦) be the fixed probability vector. Then 𝑣𝑃 = 𝑣, that is,

 (𝑥, 𝑦) [ 0.3   0.7 ] = (𝑥, 𝑦)
        [ 0.4   0.6 ]

⟹  𝑥 + 𝑦 = 1,   0.3𝑥 + 0.4𝑦 = 𝑥
⟹  𝑥 + 𝑦 = 1,   −0.7𝑥 + 0.4𝑦 = 0
⟹  𝑥 = 4/11,  𝑦 = 7/11.

In the long run he studies with probability 4/11.
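
The long-run fraction can also be checked numerically, either by solving 𝑣𝑃 = 𝑣 as in Problem 1 or simply by iterating the chain until it settles; a sketch assuming NumPy:

```python
import numpy as np

P = np.array([[0.3, 0.7],
              [0.4, 0.6]])        # rows/columns in the order S, N
v = np.array([1.0, 0.0])          # any starting distribution works for a regular chain
for _ in range(100):              # iterate v <- vP until it converges
    v = v @ P
print(v, 4/11)                    # v[0] approaches 4/11 ~ 0.3636
```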
Exercise:
1. Show that the following stochastic matrices are not regular.

 i.  [ 1/2   1/4   1/4 ]        ii.  [ 1/2   1/2    0  ]
     [  0     1     0  ]             [ 1/2   1/2    0  ]
     [ 1/2    0    1/2 ]             [ 1/4   1/4   1/2 ]
2. Every year, a man trades his car for a new car. If he has a Maruti, he trades it for a Ford. If he has a Ford, he
trades it for a Santro. However, if he has a Santro, he is just as likely to trade it for a new Santro as to trade it for a
Maruti or a Ford. In 2014 he bought his first car, which was a Santro.
Find the probability that he has i) a 2015 Santro, ii) a 2016 Maruti, iii) a 2017 Ford.
3. There are 2 white marbles in box A and 3 red marbles in box B. At each step of the process a marble is
selected from each box and the two marbles selected are interchanged. Let the state 𝑎𝑖 of the system be that
there are 𝑖 red marbles in box A. a) Find the transition probability matrix. b) What is the probability that there
are 2 red marbles in box A after 3 steps? c) In the long run, what is the probability that there are 2 red marbles in box A?
