
Mathematics for Computer Science (III Semester)

(Subject code: BCS301)

Module 2: Joint probability distribution and Markov chain

__________________________________________________________________
Syllabus:

2.1 Joint Probability distribution


Introduction:

❖ Let 𝑋 = {𝑥1 , 𝑥2 , … , 𝑥𝑚 } and 𝑌 = {𝑦1 , 𝑦2 , … , 𝑦𝑛 } be two discrete random variables. Then
𝑃(𝑥𝑖 , 𝑦𝑗 ) = 𝐽𝑖𝑗 is called the joint probability function of X and Y if it satisfies the conditions:

(𝑖) 𝐽𝑖𝑗 ≥ 0   (𝑖𝑖) ∑ (𝑖 = 1 to 𝑚) ∑ (𝑗 = 1 to 𝑛) 𝐽𝑖𝑗 = 1
❖ Set of values of this joint probability function 𝐽𝑖𝑗 is called joint probability distribution of
X and Y.
X\Y 𝑦1 𝑦2 … 𝑦𝑛 𝑆𝑢𝑚
𝑥1 𝐽11 𝐽12 … 𝐽1𝑛 𝑓(𝑥1 )
𝑥2 𝐽21 𝐽22 … 𝐽2𝑛 𝑓(𝑥2 )
… … … … … …
𝑥𝑚 𝐽𝑚1 𝐽𝑚2 … 𝐽𝑚𝑛 𝑓(𝑥𝑚 )
𝑆𝑢𝑚 𝑔(𝑦1 ) 𝑔(𝑦2 ) … 𝑔(𝑦𝑛 ) 𝑇𝑜𝑡𝑎𝑙 = 1

❖ Marginal probability distribution of X

𝑥1      𝑥2      …   𝑥𝑚
𝑓(𝑥1 )  𝑓(𝑥2 )  …   𝑓(𝑥𝑚 )

Where 𝑓(𝑥1 ) + 𝑓(𝑥2 ) + ⋯ + 𝑓(𝑥𝑚 ) = 1

Dr. Narasimhan G, RNSIT 1


❖ Marginal probability distribution of Y
𝑦1 𝑦2 … 𝑦𝑛
𝑔(𝑦1 ) 𝑔(𝑦2 ) … 𝑔(𝑦𝑛 )

Where 𝑔(𝑦1 ) + 𝑔(𝑦2 ) + ⋯ + 𝑔(𝑦𝑛 ) = 1


❖ The discrete random variables X and Y are said to be independent random variables if
𝑓(𝑥𝑖 )𝑔(𝑦𝑗 ) = 𝐽𝑖𝑗 .

Important results:
❖ Expectations:
𝐸(𝑥) = ∑ (𝑖 = 1 to 𝑚) 𝑥𝑖 𝑓(𝑥𝑖 )   𝐸(𝑦) = ∑ (𝑗 = 1 to 𝑛) 𝑦𝑗 𝑔(𝑦𝑗 )   𝐸(𝑥𝑦) = ∑ (𝑖 = 1 to 𝑚) ∑ (𝑗 = 1 to 𝑛) 𝑥𝑖 𝑦𝑗 𝐽𝑖𝑗

❖ Covariance:
𝐶𝑜𝑣(𝑥, 𝑦) = 𝐸(𝑥𝑦) − 𝐸(𝑥)𝐸(𝑦)

❖ Variance:
𝑉𝑎𝑟(𝑥) = 𝐸(𝑥 2 ) − [𝐸(𝑥)]2 𝑉𝑎𝑟(𝑦) = 𝐸(𝑦 2 ) − [𝐸(𝑦)]2

❖ Standard deviation:
𝜎𝑥 = √𝑉𝑎𝑟(𝑥) 𝜎𝑦 = √𝑉𝑎𝑟(𝑦)

❖ Correlation of X and Y:
𝐶𝑜𝑣(𝑥, 𝑦)
𝜌(𝑥, 𝑦) =
𝜎𝑥 𝜎𝑦

❖ If X and Y are independent then 𝐸(𝑥𝑦) = 𝐸(𝑥)𝐸(𝑦).
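The results above can be computed mechanically from any joint table. The following is a minimal sketch (not from the notes) using a small hypothetical joint distribution `J` to illustrate the formulas with exact fractions:

```python
from fractions import Fraction as F
from math import sqrt

# A small hypothetical joint table J[(x, y)] (illustrative only).
J = {(0, 0): F(1, 8), (0, 1): F(3, 8), (1, 0): F(3, 8), (1, 1): F(1, 8)}

# Marginals: f(x) is a row sum, g(y) is a column sum.
f, g = {}, {}
for (x, y), p in J.items():
    f[x] = f.get(x, F(0)) + p
    g[y] = g.get(y, F(0)) + p

Ex  = sum(x * p for x, p in f.items())            # E(x)
Ey  = sum(y * p for y, p in g.items())            # E(y)
Exy = sum(x * y * p for (x, y), p in J.items())   # E(xy)
cov = Exy - Ex * Ey                               # Cov(x, y) = E(xy) - E(x)E(y)
sigma_x = sqrt(sum(x * x * p for x, p in f.items()) - Ex ** 2)
sigma_y = sqrt(sum(y * y * p for y, p in g.items()) - Ey ** 2)
rho = float(cov) / (sigma_x * sigma_y)            # correlation coefficient
```

For this table Cov(x, y) = 1/8 − 1/4 = −1/8, so ρ = −0.5; the same code reproduces each worked problem below by swapping in its table.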



1. The joint distribution of random variables X and Y is

X\Y   -4    2    7
1    1/8  1/4  1/8
5    1/4  1/8  1/8

Find (i) Marginal distribution of X and Y. (ii) 𝑬(𝒙), 𝑬(𝒚) (iii) Are X and Y
independent random variables? (iv) 𝑪𝒐𝒗(𝒙, 𝒚) (v) 𝝈𝒙 , 𝝈𝒚 (vi) 𝝆(𝒙, 𝒚)

𝑥\𝑦 −4 2 7 𝑓(𝑥)
1 1/8 1/4 1/8 1/2
5 1/4 1/8 1/8 1/2
𝑔(𝑦) 3/8 3/8 1/4 𝑇𝑜𝑡𝑎𝑙 = 1

(i) Marginal probability distribution of X:

𝑥      1    5
𝑓(𝑥)  1/2  1/2

Marginal probability distribution of Y:

𝑦      −4   2    7
𝑔(𝑦)  3/8  3/8  1/4

(ii) 𝐸(𝑥) = Σ𝑥𝑓(𝑥) = 1(1/2) + 5(1/2) = 3
𝐸(𝑦) = Σ𝑦𝑔(𝑦) = −4(3/8) + 2(3/8) + 7(1/4) = 1

(iii) 𝐸(𝑥𝑦) = ΣΣ𝑥𝑖 𝑦𝑗 𝐽𝑖𝑗
= 1(−4)(1/8) + 1(2)(1/4) + 1(7)(1/8) + 5(−4)(1/4) + 5(2)(1/8) + 5(7)(1/8)
= −4/8 + 4/8 + 7/8 − 5 + 10/8 + 35/8 = 3/2

𝐸(𝑥)𝐸(𝑦) = 3(1) = 3. Therefore, 𝐸(𝑥𝑦) ≠ 𝐸(𝑥)𝐸(𝑦).
Therefore, 𝑥 and 𝑦 are not independent variables.
(iv) 𝐶𝑜𝑣(𝑥, 𝑦) = 𝐸(𝑥𝑦) − 𝐸(𝑥)𝐸(𝑦) = 3/2 − 3(1) = −3/2.
(v) 𝐸(𝑥²) = Σ𝑥²𝑓(𝑥) = 1²(1/2) + 5²(1/2) = 13
𝐸(𝑦²) = Σ𝑦²𝑔(𝑦) = (−4)²(3/8) + 2²(3/8) + 7²(1/4) = 79/4
𝜎𝑥 = √(𝐸(𝑥²) − [𝐸(𝑥)]²) = √(13 − 3²) = 2
𝜎𝑦 = √(𝐸(𝑦²) − [𝐸(𝑦)]²) = √(79/4 − 1²) = 4.33
(vi) 𝜌(𝑥, 𝑦) = 𝐶𝑜𝑣(𝑥, 𝑦) / (𝜎𝑥 𝜎𝑦) = −1.5/8.66 = −0.1732
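Problem 1 can be checked numerically; a short sketch with the same joint table and exact fractions:

```python
from fractions import Fraction as F
from math import sqrt

# Joint table from Problem 1: X in {1, 5}, Y in {-4, 2, 7}.
J = {(1, -4): F(1, 8), (1, 2): F(1, 4), (1, 7): F(1, 8),
     (5, -4): F(1, 4), (5, 2): F(1, 8), (5, 7): F(1, 8)}

assert sum(J.values()) == 1                       # valid joint distribution

Ex  = sum(x * p for (x, _), p in J.items())       # E(x)  = 3
Ey  = sum(y * p for (_, y), p in J.items())       # E(y)  = 1
Exy = sum(x * y * p for (x, y), p in J.items())   # E(xy) = 3/2
cov = Exy - Ex * Ey                               # Cov(x, y) = -3/2
Ex2 = sum(x * x * p for (x, _), p in J.items())   # E(x^2) = 13
Ey2 = sum(y * y * p for (_, y), p in J.items())   # E(y^2) = 79/4
rho = float(cov) / (sqrt(Ex2 - Ex ** 2) * sqrt(Ey2 - Ey ** 2))
```

This reproduces E(x) = 3, E(y) = 1, Cov(x, y) = −3/2 and ρ ≈ −0.1732 exactly as derived above.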



2. The joint distribution of X and Y is as follows:

X\Y   -3    2    4
1    0.1  0.2  0.2
3    0.3  0.1  0.1

Find (i) Marginal distribution of X and Y. (ii) 𝑬(𝒙), 𝑬(𝒚) (iii) Are X and Y
independent random variables? (iv) 𝑪𝒐𝒗(𝒙, 𝒚) (v) 𝝈𝒙 , 𝝈𝒚 (vi) 𝝆(𝒙, 𝒚)
𝑥\𝑦 −3 2 4 𝑓(𝑥)
1 0.1 0.2 0.2 0.5
3 0.3 0.1 0.1 0.5
𝑔(𝑦) 0.4 0.3 0.3 𝑇𝑜𝑡𝑎𝑙 = 1
(i) Marginal probability distribution of X: 𝑥 1 3
𝑓(𝑥) 0.5 0.5

𝑦 −3 2 4
Marginal probability distribution of Y:
𝑔(𝑦) 0.4 0.3 0.3

(ii) 𝐸(𝑥) = Σ𝑥𝑓(𝑥) = 1(0.5) + 3(0.5) = 2

𝐸(𝑦) = Σ𝑦𝑔(𝑦) = −3(0.4) + 2(0.3) + 4(0.3) = 0.6

(iii) 𝐸(𝑥𝑦) = ΣΣ𝑥𝑖 𝑦𝑗 𝐽𝑖𝑗 = 1(−3)(0.1) + 1(2)(0.2) + 1(4)(0.2)


+3(−3)(0.3) + 3(2)(0.1) + 3(4)(0.1)
= −0.3 + 0.4 + 0.8 − 2.7 + 0.6 + 1.2 = 0
𝐸(𝑥𝑦) = 0 , 𝐸(𝑥)𝐸(𝑦) = 2(0.6) = 1.2
Therefore, 𝐸(𝑥𝑦) ≠ 𝐸(𝑥)𝐸(𝑦).
Therefore, 𝑥 𝑎𝑛𝑑 𝑦 are not independent variables.
(iv) 𝐶𝑜𝑣 (𝑥, 𝑦) = 𝐸(𝑥𝑦) − 𝐸(𝑥)𝐸(𝑦) = 0 − 1.2 = −1.2
(v) 𝐸(𝑥 2 ) = Σ𝑥 2 𝑓(𝑥) = 12 (0.5) + 32 (0.5) = 5
𝐸(𝑦 2 ) = Σ𝑦 2 𝑔(𝑦) = (−3)2 (0.4) + 22 (0.3) + 42 (0.3) = 9.6
𝜎𝑥 = √𝐸(𝑥 2 ) − [𝐸(𝑥)]2 = √5 − 22 = 1

𝜎𝑦 = √𝐸(𝑦 2 ) − [𝐸(𝑦)]2 = √9.6 − 0.62 = 3.0397


(vi) 𝜌(𝑥, 𝑦) = 𝐶𝑜𝑣(𝑥, 𝑦) / (𝜎𝑥 𝜎𝑦) = −1.2/3.0397 = −0.3948



3. Find the joint distribution of X and Y which are the independent random variables
with the following respective distributions.

𝒙𝒊 1 2 𝒚𝒋 -2 5 8
𝒇(𝒙𝒊 ) 0.7 0.3 𝒈(𝒚𝒋 ) 0.3 0.5 0.2

Since X and Y are independent random variables, 𝐽𝑖𝑗 = 𝑓(𝑥𝑖 )𝑔(𝑦𝑗 )


Therefore,
𝑥\𝑦 −2 5 8 𝑓(𝑥)
1 0.21 0.35 0.14 0.7
2 0.09 0.15 0.06 0.3
𝑔(𝑦) 0.3 0.5 0.2 𝑇𝑜𝑡𝑎𝑙 = 1
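The construction in Problem 3 is an outer product of the two marginals; a minimal sketch:

```python
from fractions import Fraction as F

# Independence means J_ij = f(x_i) * g(y_j), so the joint table of
# Problem 3 is the outer product of the two given marginals.
f = {1: F(7, 10), 2: F(3, 10)}
g = {-2: F(3, 10), 5: F(5, 10), 8: F(2, 10)}

J = {(x, y): fx * gy for x, fx in f.items() for y, gy in g.items()}

assert J[(1, -2)] == F(21, 100)   # 0.21, as in the table above
assert sum(J.values()) == 1       # rows and columns still sum correctly
```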

4. Consider the joint distribution of X and Y:

X\Y   0     1     2     3
0    0     1/8   1/4   1/8
1    1/8   1/4   1/8   0

Compute the following probabilities:
(i) 𝑷(𝑿 = 𝟏, 𝒀 = 𝟐) (ii) 𝑷(𝑿 ≥ 𝟏, 𝒀 ≥ 𝟐) (iii) 𝑷(𝑿 ≤ 𝟏, 𝒀 ≤ 𝟐) (iv) 𝑷(𝑿 + 𝒀 ≥ 𝟐) (v) 𝑷(𝑿 ≥ 𝟏, 𝒀 ≤ 𝟐).

(i) 𝑋 = {0, 1}, 𝑌 = {0, 1, 2, 3}

𝑃(𝑋 = 1, 𝑌 = 2) = 𝑃(1, 2) = 1/8

(ii) If 𝑋 ≥ 1, 𝑋 = {1}. If 𝑌 ≥ 2, 𝑌 = {2, 3}

𝑃(𝑋 ≥ 1, 𝑌 ≥ 2) = 𝑃(1, 2) + 𝑃(1, 3) = 1/8 + 0 = 1/8

(iii) If 𝑋 ≤ 1, 𝑋 = {0, 1}. If 𝑌 ≤ 2, 𝑌 = {0, 1, 2}

𝑃(𝑋 ≤ 1, 𝑌 ≤ 2) = 𝑃(0, 0) + 𝑃(0, 1) + 𝑃(0, 2) + 𝑃(1, 0) + 𝑃(1, 1) + 𝑃(1, 2)
= 0 + 1/8 + 1/4 + 1/8 + 1/4 + 1/8 = 7/8

(iv) If 𝑋 + 𝑌 ≥ 2 then (𝑋, 𝑌) = (0, 2) or (0, 3) or (1, 1) or (1, 2) or (1, 3)
𝑃(𝑋 + 𝑌 ≥ 2) = 𝑃(0, 2) + 𝑃(0, 3) + 𝑃(1, 1) + 𝑃(1, 2) + 𝑃(1, 3)
= 1/4 + 1/8 + 1/4 + 1/8 + 0 = 3/4

(v) If 𝑋 ≥ 1, 𝑋 = {1}. If 𝑌 ≤ 2, 𝑌 = {0, 1, 2}

𝑃(𝑋 ≥ 1, 𝑌 ≤ 2) = 𝑃(1, 0) + 𝑃(1, 1) + 𝑃(1, 2) = 1/8 + 1/4 + 1/8 = 1/2
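Each of these event probabilities is just a sum of table entries over the pairs satisfying the event; a small sketch with a generic `prob` helper (an illustrative name, not from the notes):

```python
from fractions import Fraction as F

# Joint table from Problem 4 (rows X = 0, 1; columns Y = 0..3).
J = {(0, 0): 0, (0, 1): F(1, 8), (0, 2): F(1, 4), (0, 3): F(1, 8),
     (1, 0): F(1, 8), (1, 1): F(1, 4), (1, 2): F(1, 8), (1, 3): 0}

def prob(event):
    """P(event) = sum of J over the pairs (x, y) satisfying the predicate."""
    return sum(p for (x, y), p in J.items() if event(x, y))

assert prob(lambda x, y: x == 1 and y == 2) == F(1, 8)   # (i)
assert prob(lambda x, y: x >= 1 and y >= 2) == F(1, 8)   # (ii)
assert prob(lambda x, y: x <= 1 and y <= 2) == F(7, 8)   # (iii)
assert prob(lambda x, y: x + y >= 2) == F(3, 4)          # (iv)
assert prob(lambda x, y: x >= 1 and y <= 2) == F(1, 2)   # (v)
```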



5. A fair coin is tossed thrice. The random variables X and Y are defined as follows:
X = 0 or 1 according as a tail or a head occurs in the first toss. Y = Number of heads.
Determine (i) The distributions of X and Y. (ii) The joint distribution of X and Y. (iii)
The expectations of X and Y (iv) Standard deviations of X and Y (v) Covariance of X
and Y (vi) Correlation of X and Y.

𝑆 = {𝐻𝐻𝐻, 𝐻𝐻𝑇, 𝐻𝑇𝐻, 𝐻𝑇𝑇, 𝑇𝐻𝐻, 𝑇𝐻𝑇, 𝑇𝑇𝐻, 𝑇𝑇𝑇}


𝑋 = {0, 1} 𝑎𝑛𝑑 𝑌 = {0, 1, 2, 3}
(i) Marginal distribution of X:          Marginal distribution of Y:

𝑥      0    1                            𝑦      0    1    2    3
𝑓(𝑥)  1/2  1/2                           𝑔(𝑦)  1/8  3/8  3/8  1/8

(ii) 𝐽00 = 𝑃(𝐹𝑖𝑟𝑠𝑡 𝑡𝑎𝑖𝑙, 𝑛𝑜 ℎ𝑒𝑎𝑑𝑠) = 1/8


𝐽01 = 𝑃(𝐹𝑖𝑟𝑠𝑡 𝑡𝑎𝑖𝑙, 1 ℎ𝑒𝑎𝑑) = 2/8
𝐽02 = 𝑃(𝐹𝑖𝑟𝑠𝑡 𝑡𝑎𝑖𝑙, 2 ℎ𝑒𝑎𝑑𝑠) = 1/8
𝐽03 = 𝑃(𝐹𝑖𝑟𝑠𝑡 𝑡𝑎𝑖𝑙, 3 ℎ𝑒𝑎𝑑𝑠) = 0
𝐽10 = 𝑃(𝐹𝑖𝑟𝑠𝑡 ℎ𝑒𝑎𝑑, 𝑛𝑜 ℎ𝑒𝑎𝑑𝑠) = 0
𝐽11 = 𝑃(𝐹𝑖𝑟𝑠𝑡 ℎ𝑒𝑎𝑑, 1 ℎ𝑒𝑎𝑑) = 1/8
𝐽12 = 𝑃(𝐹𝑖𝑟𝑠𝑡 ℎ𝑒𝑎𝑑, 2 ℎ𝑒𝑎𝑑𝑠) = 2/8
𝐽13 = 𝑃(𝐹𝑖𝑟𝑠𝑡 ℎ𝑒𝑎𝑑, 3 ℎ𝑒𝑎𝑑𝑠) = 1/8
The joint distribution of X and Y:

X\Y   0     1     2     3
0    1/8   2/8   1/8   0
1    0     1/8   2/8   1/8

(iii) 𝐸(𝑥) = Σ𝑥𝑓(𝑥) = 0(1/2) + 1(1/2) = 1/2


𝐸(𝑦) = Σ𝑦𝑔(𝑦) = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8) = 3/2

(iv) 𝐸(𝑥 2 ) = Σ𝑥 2 𝑓(𝑥) = 02 (1/2) + 12 (1/2) = 1/2


𝐸(𝑦 2 ) = Σ𝑦 2 𝑔(𝑦) = 02 (1/8) + 12 (3/8) + 22 (3/8) + 32 (1/8) = 3
𝜎𝑥 = √𝐸(𝑥 2 ) − [𝐸(𝑥)]2 = √1/2 − (1/2)2 = 1/2

𝜎𝑦 = √𝐸(𝑦 2 ) − [𝐸(𝑦)]2 = √3 − (3/2)2 = √3/2


(v) 𝐸(𝑋𝑌) = 0 + 0 + 0 + 0 + 0 + 1(1)(1/8) + 1(2)(2/8) + 1(3)(1/8) = 1
Covariance of X and Y: 𝐶𝑜𝑣(𝑋, 𝑌) = 𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌) = 1 − (1/2)(3/2) = 1/4
(vi) Correlation of X and Y: 𝜌(𝑋, 𝑌) = 𝐶𝑜𝑣(𝑋, 𝑌) / (𝜎𝑥 𝜎𝑦) = (1/4) / ((1/2)(√3/2)) = 1/√3 = 0.5774
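The joint table of Problem 5 can also be obtained by brute-force enumeration of the 8 equally likely outcomes, which is a useful cross-check; a minimal sketch:

```python
from fractions import Fraction as F
from itertools import product

# Enumerate the 8 equally likely outcomes of three tosses:
# X = 1 if the first toss is a head (else 0), Y = number of heads.
J = {}
for toss in product("HT", repeat=3):
    x = 1 if toss[0] == "H" else 0
    y = toss.count("H")
    J[(x, y)] = J.get((x, y), F(0)) + F(1, 8)

assert J[(0, 0)] == F(1, 8)        # TTT
assert J[(1, 2)] == F(2, 8)        # HHT, HTH

Exy = sum(x * y * p for (x, y), p in J.items())
cov = Exy - F(1, 2) * F(3, 2)      # E(X) = 1/2, E(Y) = 3/2
```

The enumeration reproduces E(XY) = 1 and Cov(X, Y) = 1/4 from the derivation above.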



6. The joint probability distribution of two discrete random variables X and Y is given
by 𝒇(𝒙, 𝒚) = 𝒌(𝟐𝒙 + 𝒚) for 𝟎 ≤ 𝒙 ≤ 𝟐 , 𝟎 ≤ 𝒚 ≤ 𝟑. (i) Find the value of 𝒌. (ii) The
marginal distribution of X and Y (iii) Show that X and Y are dependent.

By data, 𝑋 = {0, 1, 2} and 𝑌 = {0, 1, 2, 3}


𝑓(𝑥, 𝑦) = 𝑘(2𝑥 + 𝑦)
The joint probability distribution of X and Y:
X Y 0 1 2 3 𝑓(𝑋)
0 0 𝑘 2𝑘 3𝑘 6𝑘
1 2𝑘 3𝑘 4𝑘 5𝑘 14𝑘
2 4𝑘 5𝑘 6𝑘 7𝑘 22𝑘
g(Y) 6𝑘 9𝑘 12𝑘 15𝑘 42k

(i) Find the value of 𝑘:

1 = ΣΣ𝑓(𝑥, 𝑦) = 42𝑘, so 𝑘 = 1/42

(ii) Marginal probability distribution of X:

𝑥      0     1      2
𝑓(𝑥)  6/42  14/42  22/42

Marginal probability distribution of Y:

𝑦      0     1     2      3
𝑔(𝑦)  6/42  9/42  12/42  15/42

(iii) Take the entry 𝐽𝑖𝑗 = 𝑓(0, 1) = 𝑘.

𝑓(𝑥𝑖 ) × 𝑔(𝑦𝑗 ) = 𝑓(0) × 𝑔(1) = 6𝑘 × 9𝑘 = 54𝑘²

𝐽𝑖𝑗 ≠ 𝑓(𝑥𝑖 ) × 𝑔(𝑦𝑗 ). Therefore, X and Y are dependent.
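The whole of Problem 6 follows mechanically once the weights 2x + y are tabulated; a minimal sketch:

```python
from fractions import Fraction as F

# Problem 6: tabulate f(x, y) = k(2x + y) and fix k by total probability 1.
X, Y = range(3), range(4)
weights = {(x, y): 2 * x + y for x in X for y in Y}
k = F(1, sum(weights.values()))          # total weight is 42, so k = 1/42
J = {xy: k * w for xy, w in weights.items()}

f0 = sum(J[(0, y)] for y in Y)           # marginal f(0) = 6/42
g1 = sum(J[(x, 1)] for x in X)           # marginal g(1) = 9/42
assert k == F(1, 42)
assert J[(0, 1)] != f0 * g1              # hence X and Y are dependent
```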



7. The joint probability distribution of X and Y is given by 𝒇(𝒙, 𝒚) = 𝒄(𝒙𝟐 + 𝒚𝟐 ) for
𝒙 = −𝟏, 𝟎, 𝟏, 𝟑. 𝒂𝒏𝒅 𝒚 = −𝟏, 𝟐, 𝟑. (i) Find the value of 𝒄. (ii) 𝑷(𝒙 = 𝟎, 𝒚 ≤ 𝟐) (iii)
𝑷(𝒙 ≤ 𝟏, 𝒚 > 𝟐) (iv) 𝑷(𝒙 ≥ 𝟐 − 𝒚) .

By data, 𝑋 = {−1, 0, 1, 3} and 𝑌 = {−1, 2, 3}. 𝑓(𝑥, 𝑦) = 𝑐(𝑥 2 + 𝑦 2 )


The joint probability distribution of X and Y:

X\Y   -1    2     3    𝑓(𝑋)
−1   2𝑐    5𝑐   10𝑐   17𝑐
0    𝑐     4𝑐    9𝑐   14𝑐
1    2𝑐    5𝑐   10𝑐   17𝑐
3    10𝑐   13𝑐  18𝑐   41𝑐
g(Y) 15𝑐   27𝑐  47𝑐   89𝑐

(i) Find 𝒄: 1 = ΣΣ𝑓(𝑥, 𝑦) = 89𝑐, so 𝑐 = 1/89

(ii) 𝑥 = 0, 𝑦 = {−1, 2}

𝑃(𝑥 = 0, 𝑦 ≤ 2) = 𝑃(0, −1) + 𝑃(0, 2) = 𝑐 + 4𝑐 = 5𝑐 = 5/89

(iii) 𝑥 = {−1, 0, 1}, 𝑦 = {3}

𝑃(𝑥 ≤ 1, 𝑦 > 2) = 𝑃(−1, 3) + 𝑃(0, 3) + 𝑃(1, 3)

= 10𝑐 + 9𝑐 + 10𝑐 = 29𝑐 = 29/89

(iv) 𝑃(𝑥 ≥ 2 − 𝑦) = 𝑃(𝑥 + 𝑦 ≥ 2)

= 𝑃(−1, 3) + 𝑃(0, 2) + 𝑃(0, 3) + 𝑃(1, 2) + 𝑃(1, 3)

+𝑃(3, −1) + 𝑃(3, 2) + 𝑃(3, 3)

= 10𝑐 + 4𝑐 + 9𝑐 + 5𝑐 + 10𝑐 + 10𝑐 + 13𝑐 + 18𝑐

= 79𝑐 = 79/89

Home work:

8. Two cards are selected from a box which contains 5 cards numbered 1, 1, 2, 2, 3. Find
the joint distribution of X and Y, where X denotes the sum and Y denotes the maximum of
the two numbers drawn. Also determine 𝐶𝑜𝑣(𝑥, 𝑦).

9. The joint distribution of random variables X and Y is

X\Y   1     3     6
1    1/9   1/6   1/18
3    1/6   1/4   1/12
6    1/18  1/12  1/36

Find the marginal distributions of X and Y. Are X and Y independent random variables?
10. X and Y are independent random variables. X takes the values 2, 5, 7 with probabilities
1/2, 1/4, 1/4 respectively. Y takes the values 3, 4, 5 with probabilities 1/3, 1/3, 1/3.
(i) Find the joint probability distribution of X and Y. (ii) Find the covariance of X and Y.
(iii) Find the probability distribution of 𝑍 = 𝑋 + 𝑌.



2.2 Probability vector and stochastic matrix
Introduction:

❖ A vector 𝑉 = (𝑣1 , 𝑣2 , … , 𝑣𝑛 ) is called a probability vector if each of its components
is non-negative and their sum is equal to 1.

Example: 𝑢 = (1, 0), 𝑣 = (1/2, 1/2), 𝑤 = (1/4, 1/4, 1/2)
❖ A square matrix P having every row in the form of a probability vector is called a
stochastic matrix.

Example:
( 1  0 )   ( 1  0  0 )   ( 1/2  1/2 )   (  0   1/2  1/2 )
( 0  1 ) , ( 0  1  0 ) , ( 1/2  1/2 ) , ( 1/2   0   1/2 )
           ( 0  0  1 )                  ( 1/2  1/2   0  )
❖ A stochastic matrix is said to be a regular stochastic matrix if all the entries of some power
𝑃ⁿ are positive. A stochastic matrix is not regular if 1 occurs in the principal diagonal.

❖ A regular stochastic matrix P has a unique fixed probability vector
𝑉 = (𝑣1 , 𝑣2 , … , 𝑣𝑛 ) such that 𝑉𝑃 = 𝑉, where 𝑣𝑖 = 𝑥𝑖 / ∑ 𝑥𝑖 for any non-zero solution
(𝑥1 , … , 𝑥𝑛 ) of 𝑥𝑃 = 𝑥.
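The two definitions above translate directly into code; a minimal sketch of checker functions (illustrative names, not from the notes):

```python
from fractions import Fraction as F

# A probability vector has nonnegative entries summing to 1;
# a stochastic matrix is square with probability-vector rows.
def is_probability_vector(v):
    return all(p >= 0 for p in v) and sum(v) == 1

def is_stochastic(P):
    n = len(P)
    return all(len(row) == n and is_probability_vector(row) for row in P)

assert is_probability_vector([F(1, 4), F(1, 4), F(1, 2)])
assert not is_probability_vector([F(3, 2), F(-1, 2)])      # negative entry
assert is_stochastic([[F(1, 2), F(1, 2)], [F(1, 4), F(3, 4)]])
```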
------------------------------------------------------------------------------------------------------------
1. Which of the following are probability vectors?
(i) (1, 0) (ii) (1/2, 1/2) (iii) (1/4, 1/4, 1/2) (iv) (1/4, 3/2, −1/4, 1/2) (v) (5/2, 0, 8/3, 1/6, 1/6) (vi) (3, 0, 2)
Ans: yes, yes, yes, no, no, no.
------------------------------------------------------------------------------------------------------------
2. Which of the following are stochastic matrices?

(i) (  0   1    0  )   (ii) ( 1   0 )   (iii) (  0    1  )   (iv) (  0    1  )
    ( 1/2 1/4  1/4 )        ( 0  −1 )         ( 1/2  1/2 )        ( 1/2  3/2 )

Ans: no (not a square matrix), no (0 − 1 ≠ 1), yes, no (1/2 + 3/2 ≠ 1).
------------------------------------------------------------------------------------------------------------
3. Which of the following stochastic matrices are regular?

(i) ( 1/2  1/4  1/4 )   (ii) ( 1/2  1/2   0  )   (iii) (  0    1    0  )
    (  0    1    0  )        ( 1/2  1/2   0  )         ( 1/2   0   1/2 )
    ( 1/2   0   1/2 )        ( 1/4  1/4  1/2 )         ( 1/2  1/2   0  )

(iv) (  0    1    0  )   (v) (  0    0    1  )
     (  0    0    1  )       ( 1/2  1/4  1/4 )
     ( 1/2  1/2   0  )       (  0    1    0  )

(i) No (1 lies in the principal diagonal).
(ii) No (𝑎13 and 𝑎23 are zero in 𝐴, 𝐴², 𝐴³, …).
(iii) Yes, all entries of 𝑃⁵ are positive.
(iv) Yes, all entries of 𝑃⁵ are positive.
(v) Yes, all entries of 𝑃³ are positive.
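Regularity can be tested by raising P to successive powers until every entry is positive; a minimal sketch (the helper names are illustrative, and the sample matrix is a 3×3 chain whose fifth power is the first all-positive one):

```python
from fractions import Fraction as F

# P is regular if some power P^n has all positive entries.
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def is_regular(P, max_power=20):
    Q = P
    for n in range(1, max_power + 1):
        if all(e > 0 for row in Q for e in row):
            return n                      # first power with all entries > 0
        Q = matmul(Q, P)
    return 0                              # not regular within max_power

half = F(1, 2)
P = [[0, 1, 0], [0, 0, 1], [half, half, 0]]
assert is_regular(P) == 5                 # P^5 is the first positive power
assert is_regular([[1, 0], [0, 1]]) == 0  # identity: 1s on the diagonal, never regular
```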
-----------------------------------------------------------------------------------------------------------



4. Find the unique fixed probability vector of the following regular stochastic matrices:

(i) ( 3/4  1/4 )   (ii) ( 2/3  1/3 )   (iii) ( 0.2  0.8 )
    ( 1/2  1/2 )        ( 2/5  3/5 )         ( 0.5  0.5 )

(i) Let 𝑃 = ( 3/4  1/4 ; 1/2  1/2 ) and 𝑉 be the unique fixed probability vector.

To find: 𝑉
𝑉𝑃 = 𝑉, where 𝑉 = (𝑥 𝑦), 𝑥 + 𝑦 = 1.
(𝑥 𝑦) ( 3/4  1/4 ; 1/2  1/2 ) = (𝑥 𝑦), 𝑥 + 𝑦 = 1.
(3𝑥/4 + 𝑦/2   𝑥/4 + 𝑦/2) = (𝑥 𝑦), 𝑥 + 𝑦 = 1.
Solving 3𝑥/4 + 𝑦/2 = 𝑥 and 𝑥 + 𝑦 = 1, we get 𝑥 = 2/3, 𝑦 = 1/3.

Therefore, 𝑉 = (2/3 1/3).

(ii) Let 𝑃 = ( 2/3  1/3 ; 2/5  3/5 ) and 𝑉 be the unique fixed probability vector.

To find: 𝑉
𝑉𝑃 = 𝑉, where 𝑉 = (𝑥 𝑦), 𝑥 + 𝑦 = 1.
(𝑥 𝑦) ( 2/3  1/3 ; 2/5  3/5 ) = (𝑥 𝑦), 𝑥 + 𝑦 = 1.
(2𝑥/3 + 2𝑦/5   𝑥/3 + 3𝑦/5) = (𝑥 𝑦), 𝑥 + 𝑦 = 1.
Solving 2𝑥/3 + 2𝑦/5 = 𝑥 and 𝑥 + 𝑦 = 1, we get 𝑥 = 6/11, 𝑦 = 5/11.

Therefore, 𝑉 = (6/11 5/11)


(iii) Let 𝑃 = ( 0.2  0.8 ; 0.5  0.5 ) and 𝑉 be the unique fixed probability vector.

To find: 𝑉
𝑉𝑃 = 𝑉, where 𝑉 = (𝑥 𝑦), 𝑥 + 𝑦 = 1.
(𝑥 𝑦) ( 0.2  0.8 ; 0.5  0.5 ) = (𝑥 𝑦), 𝑥 + 𝑦 = 1.
(0.2𝑥 + 0.5𝑦   0.8𝑥 + 0.5𝑦) = (𝑥 𝑦), 𝑥 + 𝑦 = 1.
Solving 0.2𝑥 + 0.5𝑦 = 𝑥 and 𝑥 + 𝑦 = 1, we get 𝑥 = 5/13, 𝑦 = 8/13.
Therefore, 𝑉 = (5/13 8/13)
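For any 2×2 stochastic matrix the fixed-vector system solves in closed form: with P = [[a, 1−a], [b, 1−b]], the condition a·x + b·y = x together with x + y = 1 gives x = b/(1 − a + b). A minimal sketch (the helper name is illustrative), checked against parts (i) and (ii) above:

```python
from fractions import Fraction as F

# Closed-form fixed probability vector of a 2x2 regular stochastic matrix:
# solving a*x + b*y = x with y = 1 - x yields x = b / (1 - a + b).
def fixed_vector_2x2(P):
    a, b = P[0][0], P[1][0]
    x = b / (1 - a + b)
    return x, 1 - x

P = [[F(3, 4), F(1, 4)], [F(1, 2), F(1, 2)]]
assert fixed_vector_2x2(P) == (F(2, 3), F(1, 3))        # part (i)
P = [[F(2, 3), F(1, 3)], [F(2, 5), F(3, 5)]]
assert fixed_vector_2x2(P) == (F(6, 11), F(5, 11))      # part (ii)
```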



5. Find the unique fixed probability vector of the following regular stochastic matrices:

(i) (  0   1/2  1/2 )   (ii) (  0    1    0  )   (iii) (  0    1    0  )
    ( 1/3  2/3   0  )        ( 1/6  1/2  1/3 )         ( 1/2   0   1/2 )
    (  0    1    0  )        (  0   2/3  1/3 )         ( 1/2  1/4  1/4 )

(i) Let 𝑃 = ( 0  1/2  1/2 ; 1/3  2/3  0 ; 0  1  0 ) and 𝑉 be the unique fixed probability vector.
To find: 𝑉
𝑉𝑃 = 𝑉, where 𝑉 = (𝑥 𝑦 𝑧), 𝑥 + 𝑦 + 𝑧 = 1.
(𝑦/3   𝑥/2 + 2𝑦/3 + 𝑧   𝑥/2) = (𝑥 𝑦 𝑧), 𝑥 + 𝑦 + 𝑧 = 1.
Solve 𝑦/3 = 𝑥, 𝑥/2 = 𝑧 and 𝑥 + 𝑦 + 𝑧 = 1.
We get 𝑥 = 2/9, 𝑦 = 6/9, 𝑧 = 1/9.
Therefore, 𝑉 = (2/9 6/9 1/9)

(ii) Let 𝑃 = ( 0  1  0 ; 1/6  1/2  1/3 ; 0  2/3  1/3 ) and 𝑉 be the unique fixed probability vector.
To find: 𝑉
𝑉𝑃 = 𝑉, where 𝑉 = (𝑥 𝑦 𝑧), 𝑥 + 𝑦 + 𝑧 = 1.
(𝑦/6   𝑥 + 𝑦/2 + 2𝑧/3   𝑦/3 + 𝑧/3) = (𝑥 𝑦 𝑧), 𝑥 + 𝑦 + 𝑧 = 1.
Solve 𝑦/6 = 𝑥, 𝑦/3 + 𝑧/3 = 𝑧 and 𝑥 + 𝑦 + 𝑧 = 1.
We get 𝑥 = 1/10, 𝑦 = 6/10, 𝑧 = 3/10.

Therefore, 𝑉 = (1/10 6/10 3/10)



(iii) Let 𝑃 = ( 0  1  0 ; 1/2  0  1/2 ; 1/2  1/4  1/4 ) and 𝑉 be the unique fixed probability vector.
To find: 𝑉
𝑉𝑃 = 𝑉, where 𝑉 = (𝑥 𝑦 𝑧), 𝑥 + 𝑦 + 𝑧 = 1.
(𝑦/2 + 𝑧/2   𝑥 + 𝑧/4   𝑦/2 + 𝑧/4) = (𝑥 𝑦 𝑧), 𝑥 + 𝑦 + 𝑧 = 1.
Solve 𝑦/2 + 𝑧/2 = 𝑥, 𝑥 + 𝑧/4 = 𝑦 and 𝑥 + 𝑦 + 𝑧 = 1.
We get 𝑥 = 5/15, 𝑦 = 6/15, 𝑧 = 4/15.

Therefore, 𝑉 = (5/15 6/15 4/15)


--------------------------------------------------------------------------------------------------------

2.3 Markov Chain


Introduction:

❖ A stochastic process in which the transition to the next state depends only on the
present state is called a Markov process.
❖ If the state space is discrete, the Markov process is called a Markov chain.
❖ The transition probability matrix of a Markov chain is a stochastic matrix.
❖ The n-step transition matrix of 𝑃 is 𝑃ⁿ.
❖ A Markov chain is said to be regular if the associated transition matrix is regular.
❖ If a transition probability matrix is regular, then it is irreducible.
❖ The unique fixed probability vector is also called the stationary probability vector.
❖ In the long run, the distribution of the chain approaches the stationary probability vector.
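Since n-step transition probabilities are the entries of Pⁿ, they can be computed by repeated matrix multiplication; a minimal sketch with exact fractions (the helper names are illustrative), using the t.p.m. of the study-habits chain solved later in this section:

```python
from fractions import Fraction as F

def matmul(A, B):
    # Standard matrix product over exact fractions.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def matpow(P, n):
    # P^n by repeated multiplication (n >= 1).
    Q = P
    for _ in range(n - 1):
        Q = matmul(Q, P)
    return Q

# t.p.m. of the study-habits example: states (Studying, Not studying).
P = [[F(3, 10), F(7, 10)], [F(2, 5), F(3, 5)]]
P2 = matpow(P, 2)        # two-night transition probabilities
```

For instance, P2[0][0] = 37/100 is the probability of studying two nights after a night of study; each row of Pⁿ still sums to 1.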
--------------------------------------------------------------------------------------------------------



1. Prove that the Markov chain whose t.p.m. is

𝑷 = (  𝟎   𝟐/𝟑  𝟏/𝟑 )
    ( 𝟏/𝟐   𝟎   𝟏/𝟐 )
    ( 𝟏/𝟐  𝟏/𝟐   𝟎  )

is irreducible. Also find the corresponding stationary probability vector.

𝑃² = (1/36) ( 18   6  12 )
            (  9  21   6 )
            (  9  12  15 )

All entries of 𝑃² are positive. ∴ 𝑃 is regular ⇒ 𝑃 is irreducible.

To find: 𝑉

𝑉𝑃 = 𝑉, where 𝑉 = (𝑥 𝑦 𝑧), 𝑥 + 𝑦 + 𝑧 = 1.
(𝑦/2 + 𝑧/2   2𝑥/3 + 𝑧/2   𝑥/3 + 𝑦/2) = (𝑥 𝑦 𝑧), 𝑧 = 1 − 𝑥 − 𝑦.
𝑦/2 + 𝑧/2 = 𝑥, 2𝑥/3 + 𝑧/2 = 𝑦, 𝑥/3 + 𝑦/2 = 𝑧 and 𝑧 = 1 − 𝑥 − 𝑦.
By solving we get 𝑥 = 1/3, 𝑦 = 10/27, 𝑧 = 8/27. Therefore, 𝑉 = (1/3 10/27 8/27).

This is the required stationary probability vector.


--------------------------------------------------------------------------------------------------------



2. Three boys A, B and C are throwing a ball to each other. A always throws the ball to B
and B always throws the ball to C. C is just as likely to throw the ball to B as to A. If
C was the first person to throw the ball, find the probabilities that after three throws
(i) A has the ball (ii) B has the ball (iii) C has the ball.

State space is {𝐴, 𝐵, 𝐶}.

Transition probability matrix (rows and columns in the order A, B, C):

𝑃 = (  0    1    0 )
    (  0    0    1 )
    ( 1/2  1/2   0 )

After three throws, the 3-step transition matrix is

𝑃³ = ( 1/2  1/2   0  )
     (  0   1/2  1/2 )
     ( 1/4  1/4  1/2 )

Conclusion: Initially C has the ball, so read row C of 𝑃³. Therefore, after three throws,

(i) 𝑃(𝐴 has the ball) = 1/4
(ii) 𝑃(𝐵 has the ball) = 1/4
(iii) 𝑃(𝐶 has the ball) = 1/2
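The answer above is just the C-row of P³; a minimal sketch that cubes the transition matrix and reads it off (the `matmul` helper is an illustrative name):

```python
from fractions import Fraction as F

def matmul(A, B):
    # Standard matrix product over exact fractions.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

h = F(1, 2)
P = [[0, 1, 0], [0, 0, 1], [h, h, 0]]     # states in the order A, B, C
P3 = matmul(matmul(P, P), P)

# C threw first, so the distribution after three throws is row C of P^3.
assert P3[2] == [F(1, 4), F(1, 4), F(1, 2)]
```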
--------------------------------------------------------------------------------------------------------



3. Every year a man trades his car for a new car. If he has a 'Suzuki', he trades it for
a 'Tata'. If he has a 'Tata', he trades it for a 'Hyundai'. If he has a 'Hyundai', he is
just as likely to trade it for a new 'Hyundai' as to trade it for a 'Suzuki' or a 'Tata'.
In 2000 he bought his first car, which was a 'Hyundai'.
(i) Find the probability that he has (a) a Hyundai in 2002 (b) a Suzuki in 2002 (c) a Tata
in 2003 (d) a Hyundai in 2003. (ii) In the long run, how often will he have a Hyundai?
Part (i)
Let A: Suzuki, B: Tata, C: Hyundai
State space is {𝐴, 𝐵, 𝐶}.
Transition probability matrix (rows and columns in the order A, B, C):

𝑃 = (  0    1    0  )
    (  0    0    1  )
    ( 1/3  1/3  1/3 )

After two years, the 2-step transition matrix is

𝑃² = (  0    0    1  )
     ( 1/3  1/3  1/3 )
     ( 1/9  4/9  4/9 )

After three years, the 3-step transition matrix is

𝑃³ = ( 1/3   1/3    1/3  )
     ( 1/9   4/9    4/9  )
     ( 4/27  7/27  16/27 )

Conclusion: Initially he has a Hyundai, so read row C. Therefore,

(a) P(he has a Hyundai in 2002) = 4/9
(b) P(he has a Suzuki in 2002) = 1/9
(c) P(he has a Tata in 2003) = 7/27
(d) P(he has a Hyundai in 2003) = 16/27



Part (ii)
To find: 𝑉

𝑉𝑃 = 𝑉, where 𝑉 = (𝑥 𝑦 𝑧), 𝑥 + 𝑦 + 𝑧 = 1.
(𝑧/3   𝑥 + 𝑧/3   𝑦 + 𝑧/3) = (𝑥 𝑦 𝑧), 𝑧 = 1 − 𝑥 − 𝑦.
𝑧/3 = 𝑥, 𝑥 + 𝑧/3 = 𝑦, 𝑦 + 𝑧/3 = 𝑧 and 𝑧 = 1 − 𝑥 − 𝑦.
By solving we get 𝑥 = 1/6, 𝑦 = 1/3, 𝑧 = 1/2.
Therefore, 𝑉 = (1/6 1/3 1/2)

Conclusion: In the long run, the probability that he has a Hyundai = 1/2.


--------------------------------------------------------------------------------------------------------



4. A student's study habits are as follows: If he studies one night, he is 70% sure not
to study the next night. On the other hand, if he does not study one night, he is 60% sure
not to study the next night. In the long run, how often does he study?
Let A: Studying, B: Not studying
State space is {𝐴, 𝐵}.
Transition probability matrix (rows and columns in the order A, B):

𝑃 = ( 0.3  0.7 )
    ( 0.4  0.6 )

To find: 𝑉
𝑉𝑃 = 𝑉, where 𝑉 = (𝑥 𝑦), 𝑥 + 𝑦 = 1.
(0.3𝑥 + 0.4𝑦   0.7𝑥 + 0.6𝑦) = (𝑥 𝑦), 𝑦 = 1 − 𝑥.
0.3𝑥 + 0.4(1 − 𝑥) = 𝑥, 𝑦 = 1 − 𝑥
By solving we get 𝑥 = 4/11, 𝑦 = 7/11. Therefore, 𝑉 = (4/11 7/11)

Conclusion: In the long run, he studies 4/11 of the time.


--------------------------------------------------------------------------------------------------------



5. A man's smoking habits are as follows. If he smokes filter cigarettes one week, he
switches to non-filter cigarettes the next week with probability 0.2. On the other
hand, if he smokes non-filter cigarettes one week, there is a probability of 0.7 that he
will smoke non-filter cigarettes the next week as well. In the long run, how often
does he smoke filter cigarettes?
Let A: Filter cigarettes, B: Non-filter cigarettes
State space is {𝐴, 𝐵}.
Transition probability matrix (rows and columns in the order A, B):

𝑃 = ( 0.8  0.2 )
    ( 0.3  0.7 )

To find: 𝑉
𝑉𝑃 = 𝑉, where 𝑉 = (𝑥 𝑦), 𝑥 + 𝑦 = 1.
(0.8𝑥 + 0.3𝑦   0.2𝑥 + 0.7𝑦) = (𝑥 𝑦), 𝑦 = 1 − 𝑥.
0.8𝑥 + 0.3(1 − 𝑥) = 𝑥, 𝑦 = 1 − 𝑥
By solving we get 𝑥 = 3/5, 𝑦 = 2/5. Therefore, 𝑉 = (3/5 2/5)

Conclusion: In the long run, he smokes filter cigarettes 3/5 of the time.
--------------------------------------------------------------------------------------------------------

