
Module 1. Introduction to Probability

Mutually exclusive events (disjoint events): Two events are mutually exclusive if the occurrence
of one event prevents the occurrence of the other, i.e., A ∩ B = ∅.
Mutually exhaustive events: The events A and B are said to be mutually exhaustive if A ∪ B = S.
Independent events: Two events are independent if the occurrence of one event does not affect
the occurrence of the other,

i.e., the events A and B are independent if P(A ∩ B) = P(A)P(B).

In general, P(A₁ ∩ A₂ ∩ … ∩ Aₙ) = P(A₁)P(A₂)…P(Aₙ) if A₁, A₂, …, Aₙ are independent events.

Conditional probability: The probability of the event A given that the event B has already
occurred is called the conditional probability of A and is defined as

P(A / B) = P(A ∩ B) / P(B), provided P(B) ≠ 0.

Similarly, the probability of the event B given that the event A has already occurred is

P(B / A) = P(A ∩ B) / P(A), provided P(A) ≠ 0.

Note: 1. If events A and B are independent, then P(A / B) = P(A) and P(B / A) = P(B).

2. If events A and B are mutually exclusive, then P(A / B) = 0 and P(B / A) = 0.


Addition Rule of Probability:
Let A, B, and C be events of a sample space S. Then

1. P(A ∪ B) = P(A)+P(B) - P(A ∩ B)


2. P(A ∪ B ∪ C ) = P(A)+P(B)+P(C)-P(A ∩ B)-P(A ∩ C)-P(B ∩ C)+P(A ∩ B ∩ C)
Multiplication Rule of Probability:
Let A and B be events of a sample space S. Then P(A ∩ B) = P(A)P(B / A) = P(B)P(A / B).

Examples:
1. A box contains 4 bad and 6 good tubes. Two are drawn out from the box at a time. One of
them is tested and found to be good. What is the probability that the other one is also
good? Ans: P(B/A) = 5/9
2. The gender breakdown in a math class of 40 students is 25 boys to 15 girls. In the
end-of-year exam, 12 boys and 5 girls made an A grade. If we pick a student at
random from the class, what is the probability of choosing a girl or an A-grade student?
3. If you draw a single card from a regular pack of cards, what is the probability that the
card is either an ace or a spade?
4. A bag contains 5 white and 3 black balls. Two balls are drawn at random one after the
other without replacement. Find the probability that both balls drawn are black.
(Verified in the sketch below.) Ans: P(A ∩ B) = 3/28
5. In a random experiment, P(A) = 1/12, P(B) = 5/12 and P(B/A) = 1/15; find P(A ∪ B).
Ans: P(A ∪ B) = 89/180
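
The answers to Examples 4 and 5 can be checked with a few lines of exact arithmetic. A minimal sketch in Python (standard fractions module only; variable names are illustrative):

from fractions import Fraction as F

# Example 4: 5 white + 3 black, drawn without replacement
# P(both black) = P(first black) * P(second black / first black)
p_both_black = F(3, 8) * F(2, 7)
print(p_both_black)                  # 3/28

# Example 5: multiplication rule gives P(A ∩ B) = P(A)P(B/A),
# then the addition rule gives P(A ∪ B)
pA, pB, pB_given_A = F(1, 12), F(5, 12), F(1, 15)
p_union = pA + pB - pA * pB_given_A
print(p_union)                       # 89/180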

Total probability theorem:


Statement: If B₁, B₂, B₃, …, Bₙ are mutually exclusive and exhaustive events of a sample
space S and A is any event associated with the events B₁, B₂, B₃, …, Bₙ, then
P(A) = ∑ᵢ₌₁ⁿ P(Bᵢ) P(A / Bᵢ)
Proof: Self Study
Bayes' Theorem:
Statement: Let A₁, A₂, A₃, …, Aₙ be a partition of the sample space S and let B be any other
event of S such that P(Aᵢ) ≠ 0 for every i = 1, 2, …, n and P(B) ≠ 0. Then

P(Aᵢ / B) = P(Aᵢ) P(B / Aᵢ) / ∑ⱼ₌₁ⁿ P(Aⱼ) P(B / Aⱼ)

i.e., P(Aᵢ / B) = Pᵢ Pᵢ′ / (P₁P₁′ + P₂P₂′ + ⋯ + PₙPₙ′), where Pᵢ = P(Aᵢ) and Pᵢ′ = P(B / Aᵢ).

Proof:
We know that for any two events A, B of a sample space S, P(A / B) = P(A ∩ B) / P(B).

∴ P(Aᵢ / B) = P(Aᵢ ∩ B) / P(B)   ……. (1)

and P(B / Aᵢ) = P(Aᵢ ∩ B) / P(Aᵢ)   ……. (2)

From (2) we have P(Aᵢ ∩ B) = P(Aᵢ) P(B / Aᵢ)   ……… (3)

Equation (1) becomes P(Aᵢ / B) = P(Aᵢ) P(B / Aᵢ) / P(B)   ……….. (4)

We know that B = (B ∩ A₁) ∪ (B ∩ A₂) ∪ … ∪ (B ∩ Aₙ), and these events are mutually exclusive.

∴ P(B) = P(B ∩ A₁) + P(B ∩ A₂) + ⋯ + P(B ∩ Aₙ)

∴ P(B) = P(A₁ ∩ B) + P(A₂ ∩ B) + ⋯ + P(Aₙ ∩ B)

∴ P(B) = P(A₁)P(B / A₁) + P(A₂)P(B / A₂) + ⋯ + P(Aₙ)P(B / Aₙ)   …. using (3)

∴ P(B) = ∑ⱼ₌₁ⁿ P(Aⱼ) P(B / Aⱼ)

Equation (4) becomes

P(Aᵢ / B) = P(Aᵢ) P(B / Aᵢ) / ∑ⱼ₌₁ⁿ P(Aⱼ) P(B / Aⱼ)
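
Bayes' theorem and the total probability theorem translate directly into code. A minimal sketch in Python (the helper name posterior and its argument layout are my own, for illustration):

def posterior(priors, likelihoods):
    # priors:      P(A_1), ..., P(A_n) for a partition of S
    # likelihoods: P(B / A_1), ..., P(B / A_n)
    # Total probability theorem: P(B) = sum over i of P(A_i) P(B / A_i)
    p_b = sum(p * l for p, l in zip(priors, likelihoods))
    # Bayes' theorem: P(A_i / B) = P(A_i) P(B / A_i) / P(B)
    return [p * l / p_b for p, l in zip(priors, likelihoods)]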

Examples:

1. A box contains 7 red and 13 blue balls. Two balls are selected at random and are
discarded without their colors being seen. If a third ball is drawn randomly and
observed to be red, what is the probability that both of the discarded balls were blue?
2. A bag contains 7 red and 3 black balls and another bag contains 4 red and 5 black balls.
One ball is transferred from the first bag to the second bag and then a ball is drawn from
the second bag. If this ball happens to be red, find the probability that a black ball was
transferred.
3. A company has two plants to manufacture scooters. Plant I manufactures 80% of the
scooters and plant II the rest. At plant I, 85 out of 100 scooters are rated higher quality
and at plant II, only 65 out of 100 scooters are rated higher quality. A scooter is chosen
at random. What is the probability that the scooter came from plant II, if it is known
that the scooter is of higher quality? (Checked in the sketch after this list.)
4. A factory production line is manufacturing bolts using three machines, A, B and C. Of the
total output, machine A is responsible for 25%, machine B for 35% and machine C for the
rest. It is known from previous experience with the machines that 5% of the output from
machine A is defective, 4% from machine B and 2% from machine C. A bolt is chosen at
random from the production line and found to be defective. What is the probability that
it came from (a) machine A (b) machine B (c) machine C?
5. A bag contains three true coins and one false coin with heads on both sides. A coin is
chosen at random and tossed four times. If a head occurs all four times, what is the
probability that the false coin was chosen and used?
6. The chance that doctor A will diagnose a disease X correctly is 60%. The chance
that a patient will die by his treatment after correct diagnosis is 40% and the
chance of death by wrong diagnosis is 70%. A patient of doctor A, who had
disease X, died. What is the chance that his disease was diagnosed correctly?
7. A man is known to speak the truth 2 out of 3 times. He throws a die and reports that the
number obtained is a four. Find the probability that the number obtained is actually a
four.

8. From a pack of 52 cards, one card is lost. From the remaining cards of the pack, two
cards are drawn and both are found to be diamond cards. What is the probability that
the lost card is a diamond?
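
As a numerical check on Example 3, the posterior sketch above (or direct arithmetic) gives:

# Example 3: plant I makes 80% of the scooters, plant II the rest;
# higher-quality rates are 85% and 65% respectively.
priors = [0.80, 0.20]            # P(plant I), P(plant II)
likelihoods = [0.85, 0.65]       # P(higher quality / plant)
p_hq = sum(p * l for p, l in zip(priors, likelihoods))   # P(higher quality) = 0.81
print(priors[1] * likelihoods[1] / p_hq)                 # 0.13/0.81 ≈ 0.1605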
Examples on Total probability Theorem:
1. A factory produces its entire output with three machines. Machines I, II, and III produce
50%, 30% and 20% of the output, but 4%, 2% and 4% of their outputs are defective,
respectively. What fraction of the total output is defective? Ans: 0.034 (verified in the sketch below)
2. In a coin tossing experiment if the coin shows head, one die is thrown and the number is
recorded. If the coin shows tail, two dice are thrown and their sum is recorded. What is
the probability that the recorded number will be 2 ? Ans: 7/72
3. A box contains 5 red and 4 white balls. A ball is taken out of the box at random and
kept outside. If once again a ball is drawn from the box, what is the probability that the
drawn ball is red? Ans: 5/9
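
Example 1 is the total probability theorem with the machines as the partition. A one-line check in Python (values from the problem statement):

# P(defective) = sum of (share of output) * (defective rate) over the machines
shares = [0.50, 0.30, 0.20]
defect_rates = [0.04, 0.02, 0.04]
print(sum(s * d for s, d in zip(shares, defect_rates)))   # 0.034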
Module 2. Random Variables and Their Distributions
Random Variable:
A random variable is a function that maps the sample space of a random experiment
to the real numbers (i.e., X: S → R).
For example, the number of heads obtained when tossing 2 coins.
Note:
1. If X₁ and X₂ are random variables and C₁ and C₂ are any constants, then C₁X₁ + C₂X₂
is also a random variable.
2. If X is a random variable, then 1/X (provided X ≠ 0) and |X| are also random variables.

Discrete random variable:

A variable X is said to be a D.R.V. if X takes finitely or countably many values x₀, x₁, x₂, …, xₙ, …
Note: 1. Expectation E(X): If X is a DRV, then the expectation of X is denoted by E(X) and is defined as
E(X) = ∑ᵢ xᵢ P(xᵢ)
Mean: E(X) = ∑ᵢ xᵢ P(xᵢ)
Variance: V(X) = E(X²) − [E(X)]²
Probability mass function:
If X is a D.R.V. and x₀, x₁, x₂, …, xₙ, … are the values of X with corresponding probabilities
P(x₀), P(x₁), P(x₂), …, P(xₙ), …, then the function P is called a PMF if
I) P(xᵢ) ≥ 0 for all i
II) ∑ᵢ P(xᵢ) = 1

Probability distribution function:

Let X be a D.R.V. Then the probability distribution function of X is defined as
F(x) = P(X ≤ x) = ∑_{xᵢ ≤ x} P(X = xᵢ), i = 1, 2, 3, …

Examples:
1. X is a discrete random variable with the following probability distribution table:

X        1    2    3    4     5       6     7
P(X=x)   k    2k   3k   k²    k²+k    2k²   4k²

Find i) k ii) P(X < 5) iii) P(X < 5 / 2 < X ≤ 6). (See the sketch after this list.)
Ans: 1/8, 49/64, 25/36
2. The probability distribution of a random variable X is

X        0     1     2     3
P(X=x)   0.1   0.3   0.5   0.1

If Y = X² + 2X, find the mean and variance of Y, and P(Y ≤ 5). Ans: 6.4, 16.24, 0.9
3. If the random variable X takes the values 1, 2, 3 and 4 such that
2P(X = 1) = 3P(X = 2) = P(X = 3) = 5P(X = 4), find the probability distribution and
the mean.
4. If P(X = x) = kx for x = 1, 2, 3, 4, 5 and 0 otherwise represents a probability mass
function, find k and
P(1/2 < X ≤ 5/2 / X > 1)
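
Example 1 can be solved symbolically. A short sketch with sympy (assuming the table above; variable names are illustrative):

import sympy as sp

k = sp.symbols('k', positive=True)
pmf = [k, 2*k, 3*k, k**2, k**2 + k, 2*k**2, 4*k**2]   # P(X = 1), ..., P(X = 7)
k_val = sp.solve(sp.Eq(sum(pmf), 1), k)[0]            # total probability 1 => k = 1/8
probs = [q.subs(k, k_val) for q in pmf]
print(k_val)                                          # 1/8
print(sum(probs[0:4]))                                # P(X < 5) = 49/64
print(sum(probs[2:4]) / sum(probs[2:6]))              # P(X < 5 / 2 < X <= 6) = 25/36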

Continuous random variable:

A variable X is said to be a C.R.V. if X takes uncountably many values in a given interval.
For example, X taking any value in [0, 1].
Probability density function:
A function f(x) ≥ 0 with ∫_{-∞}^{∞} f(x) dx = 1, defined on [a, b], is the PDF of the C.R.V. X if
P(α ≤ X ≤ β) = ∫_α^β f(x) dx, where a < α < β < b

Expectation E(X):

If X is a CRV, then the expectation of X is denoted by E(X) and is defined as
E(X) = ∫_{-∞}^{∞} x f(x) dx

Mean: E(X)
Variance: V(X) = E(X²) − [E(X)]²

Cumulative distribution function:

Let f(x) be a continuous function defined on [a, b]. Then the cumulative distribution function F(x)
of a CRV X with PDF f(x) is given by F(x) = P(X ≤ x) = ∫_{-∞}^{x} f(t) dt, where −∞ < x < ∞

Note: P(a ≤ X ≤ b) = F(b) − F(a)

Examples:
1. A random variable X has the PDF f(x) = 2x for 0 < x < 1, and 0 otherwise.
Find i) P(X < 1/2) ii) P(1/4 < X < 1/2) iii) P(X > 3/4 / X > 1/2) iv) the CDF of X.
(See the sketch after this list.)
2. A random variable X has the PDF f(x) = kx(1 − x), 0 ≤ x ≤ 1.
Find i) k ii) the number b such that P(X ≤ b) = P(X ≥ b)
3. The PDF of a random variable X is f(x) = x for 0 < x < 1, f(x) = 2 − x for 1 < x < 2,
and 0 otherwise.
Find the CDF of X.
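
Example 1 reduces to integrating the PDF. A sympy sketch (assuming part iii asks for P(X > 3/4 / X > 1/2), as reconstructed above):

import sympy as sp

x = sp.symbols('x')
f = 2 * x   # PDF on (0, 1)

def prob(a, b):
    # P(a <= X <= b) for the PDF f(x) = 2x on (0, 1)
    return sp.integrate(f, (x, a, b))

print(prob(0, sp.Rational(1, 2)))                               # i)   1/4
print(prob(sp.Rational(1, 4), sp.Rational(1, 2)))               # ii)  3/16
print(prob(sp.Rational(3, 4), 1) / prob(sp.Rational(1, 2), 1))  # iii) 7/12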
Two-dimensional random variables:
Let X and Y be two random variables defined on the same sample space. Then the function (X, Y)
that assigns a point in R² (= R × R) to each outcome is called a two-dimensional random variable.
Notation:
1) { 𝑋 ≤ 𝑎, 𝑌 ≤ 𝑏} denotes the event of all elements 𝑠 ∈ 𝑆 such that 𝑋(𝑠) ≤ 𝑎 and 𝑌(𝑠) ≤ 𝑏
2) Probability of event { 𝑋 ≤ 𝑎, 𝑌 ≤ 𝑏} is denoted as 𝑃{𝑋 ≤ 𝑎, 𝑌 ≤ 𝑏}

Discrete random variable:


Joint probability distribution for (X, Y):

Let (X, Y) be a two-dimensional discrete R.V. on a sample space with respective image sets
X(s) = {x₁, x₂, …, xₙ} and Y(s) = {y₁, y₂, …, yₘ}. Then the function p on X(s) × Y(s) defined by

p(xᵢ, yⱼ) = P{X = xᵢ, Y = yⱼ} = pᵢⱼ is called the joint probability distribution for (X, Y) and

is represented in the following way:
X \ Y    y₁     y₂     …    yₘ     Total

x₁       p₁₁    p₁₂    …    p₁ₘ    p₁*

x₂       p₂₁    p₂₂    …    p₂ₘ    p₂*

…        …      …      …    …      …

xₙ       pₙ₁    pₙ₂    …    pₙₘ    pₙ*

Total    p*₁    p*₂    …    p*ₘ    1

Marginal probability distribution for (X, Y):

Let (X, Y) be a discrete two-dimensional R.V. Then

the marginal probability distribution of X is
P{X = xᵢ} = ∑ⱼ₌₁ᵐ pᵢⱼ = pᵢ₁ + pᵢ₂ + ⋯ + pᵢₘ = pᵢ* and

the marginal probability distribution of Y is

P{Y = yⱼ} = ∑ᵢ₌₁ⁿ pᵢⱼ = p₁ⱼ + p₂ⱼ + ⋯ + pₙⱼ = p*ⱼ


Joint probability mass function (two-dimensional probability mass function):

The function p(xᵢ, yⱼ) = pᵢⱼ is called a joint probability mass function if it satisfies the
following conditions:
i) pᵢⱼ ≥ 0 for all i, j
ii) ∑ⱼ₌₁ᵐ ∑ᵢ₌₁ⁿ pᵢⱼ = 1

Cumulative distribution function:

The cumulative distribution function of a two-dimensional discrete random variable (X, Y) is
denoted by F(x, y) and is defined by F(x, y) = P(X ≤ x, Y ≤ y) = ∑_{xᵢ ≤ x} ∑_{yⱼ ≤ y} pᵢⱼ

Conditional probability distribution:

Let (X, Y) be a discrete two-dimensional R.V. Then the conditional probability distribution of X is
defined as

P(X = xᵢ / Y = yⱼ) = P(X = xᵢ, Y = yⱼ) / P(Y = yⱼ) = pᵢⱼ / p*ⱼ

Similarly, the conditional probability distribution of Y is defined as

P(Y = yⱼ / X = xᵢ) = P(X = xᵢ, Y = yⱼ) / P(X = xᵢ) = pᵢⱼ / pᵢ*
Independent random variables:
The two-dimensional discrete random variable (X, Y) is said to be independent if pᵢⱼ = pᵢ* p*ⱼ for all i, j.

Examples:
I. The joint probability mass function of (X, Y) is given by P(x, y) = k(2x + 3y), x = 0, 1, 2
and y = 1, 2, 3. Find k, the marginal and conditional probability distributions, and
1. P(X = 2, Y ≤ 2) 2. P(X ≤ 2, Y = 3) 3. P(X = 2) 4. P(X ≤ 1 / Y ≤ 2)
(Items 1, 3 and 4 are checked in the sketch after this list.)
Ans: k = 1/72, 17/72, 20/72, 30/72, 22/39
II. Given the following bivariate probability distribution:

Y \ X    −1      0      1

0       1/15   2/15   1/15

1       3/15   2/15   1/15

2       2/15   1/15   2/15

Find a. the marginal distributions of X and Y

b. the conditional distribution of X given that Y = 2
c. whether the random variables X and Y are independent
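
Example I can be verified mechanically. A sketch with exact fractions (the dictionary layout is my own):

from fractions import Fraction as F

# P(x, y) = k(2x + 3y), x = 0, 1, 2; y = 1, 2, 3
weights = {(x, y): 2*x + 3*y for x in range(3) for y in range(1, 4)}
k = F(1, sum(weights.values()))                # total probability 1 => k = 1/72
p = {xy: k * w for xy, w in weights.items()}

print(sum(p[(2, y)] for y in (1, 2)))          # P(X = 2, Y <= 2) = 17/72
print(sum(p[(2, y)] for y in (1, 2, 3)))       # P(X = 2) = 30/72 = 5/12
num = sum(p[(x, y)] for x in (0, 1) for y in (1, 2))
den = sum(p[(x, y)] for x in (0, 1, 2) for y in (1, 2))
print(num / den)                               # P(X <= 1 / Y <= 2) = 22/39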
Continuous random variable:
Joint probability density function for (X, Y):
Let f(x, y) be a bivariate function and (X, Y) a two-dimensional continuous random variable.
Then f(x, y) is said to be the joint probability density function of (X, Y) if
1. f(x, y) ≥ 0 ∀ (x, y) ∈ R²
2. ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1

Note: P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx

Marginal probability density function:

Let (X, Y) be a two-dimensional continuous random variable. Then

the marginal density function of X is defined as f_X(x) = ∫_{-∞}^{∞} f(x, y) dy and

the marginal density function of Y is defined as f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx

Cumulative distribution function:

The cumulative distribution function of a two-dimensional continuous random variable (X, Y) is
denoted by F(x, y) and is defined by F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{-∞}^{y} ∫_{-∞}^{x} f(u, v) du dv

Note: F(∞, ∞) = 1

Conditional probability distribution:

Let (X, Y) be a two-dimensional continuous random variable. Then

the conditional density of X given Y is defined as f_{X/Y}(x/y) = f(x, y) / f_Y(y) and

the conditional density of Y given X is defined as f_{Y/X}(y/x) = f(x, y) / f_X(x)

Independent random variables:

The two-dimensional continuous random variable (X, Y) is said to be independent
if f(x, y) = f_X(x) f_Y(y).
Note: If X and Y are independent, then f(x/y) = f_X(x) and f(y/x) = f_Y(y).

Examples:
1. Find k if the joint PDF of a bivariate random variable (X, Y) is given by
f(x, y) = k(1 − x)(1 − y) if 0 < x < 4; 1 < y < 5, and 0 otherwise.
2. Find k if the joint PDF of a bivariate random variable (X, Y) is given by
f(x, y) = kye^{−x}, x > 0; 0 < y < 2
3. If f(x, y) = xe^{−x(y+1)} for x ≥ 0, y ≥ 0, and 0 otherwise, is the joint PDF of a
two-dimensional random variable (X, Y), find 1) the marginal density functions
2) the conditional density functions 3) whether X and Y are independent.
4. The joint PDF of a bivariate random variable (X, Y) is given by
f(x, y) = kxy if 0 < x < 1; 0 < y < 1, and 0 otherwise.
Find 1. k 2. P(X + Y < 1) 3. whether X and Y are independent random variables.
(See the sketch after this list.)
5. The joint PDF of a two-dimensional random variable (X, Y) is given by
f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2; 0 ≤ y ≤ 1
Find i) P(X > 1) ii) P(Y < 1/2) iii) P(X > 1 / Y < 1/2) iv) P(X < Y)
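
Example 4 works out cleanly in sympy; a sketch under the stated support 0 < x, y < 1:

import sympy as sp

x, y, k = sp.symbols('x y k', positive=True)
f = k * x * y                                              # joint PDF on the unit square

# 1. Normalisation: the double integral must equal 1
k_val = sp.solve(sp.Eq(sp.integrate(f, (x, 0, 1), (y, 0, 1)), 1), k)[0]   # k = 4
f = f.subs(k, k_val)
# 2. P(X + Y < 1): integrate over the triangle y < 1 - x
p_sum = sp.integrate(f, (y, 0, 1 - x), (x, 0, 1))          # 1/6
# 3. Independence: f(x, y) factors as (2x)(2y) = f_X(x) f_Y(y)
fX = sp.integrate(f, (y, 0, 1))
fY = sp.integrate(f, (x, 0, 1))
print(k_val, p_sum, sp.simplify(f - fX * fY) == 0)         # 4, 1/6, True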

Expectation of a two-dimensional random variable:

1. For discrete random variables:
If (X, Y) is a two-dimensional discrete random variable, then

E(X) = ∑ᵢ xᵢ P(xᵢ), E(Y) = ∑ⱼ yⱼ P(yⱼ)

E(XY) = ∑ᵢ ∑ⱼ xᵢ yⱼ P(X = xᵢ, Y = yⱼ) = ∑ᵢ ∑ⱼ xᵢ yⱼ p(xᵢ, yⱼ)

E(X / Y = yⱼ) = ∑ᵢ xᵢ P(X = xᵢ / Y = yⱼ)

E(Y / X = xᵢ) = ∑ⱼ yⱼ P(Y = yⱼ / X = xᵢ)

2. For continuous random variables:

If (X, Y) is a two-dimensional continuous random variable, then

E(X) = ∫_{-∞}^{∞} x f_X(x) dx, E(Y) = ∫_{-∞}^{∞} y f_Y(y) dy

E(XY) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} xy f(x, y) dx dy

E(X/Y) = ∫_{-∞}^{∞} x f(x/y) dx

E(Y/X) = ∫_{-∞}^{∞} y f(y/x) dy

Note: 1. Mean = Expectation 2. Variance: Var(X) = E(X²) − [E(X)]²


Covariance: Cov(X, Y) = E(XY) − E(X)E(Y)
Properties:
E(aX) = aE(X), E(k) = k

Var(aX) = a² Var(X), Var(k) = 0

Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)

Var(aX − bY) = a² Var(X) + b² Var(Y) − 2ab Cov(X, Y)

If X and Y are independent, then Cov(X, Y) = 0
Cov(X + Y, X − Y) = Var(X) − Var(Y)
Cov(aX + bY, cX + dY) = ac Var(X) + bd Var(Y) + (ad + bc) Cov(X, Y)
Examples:
I. Given the following bivariate probability distribution:

Y \ X    −1     0      1      Total

−1        0    0.1    0.1    0.2

0        0.2   0.2    0.2    0.6

1         0    0.1    0.1    0.2

Total    0.2   0.4    0.4    1.0

Find E(X), E(Y), E(XY), Var(X) and Var(Y), and prove that X and Y are uncorrelated.
(Checked in the sketch after this list.)
II. If the joint PDF is given by f(x, y) = x + y, 0 ≤ x ≤ 1; 0 ≤ y ≤ 1, find E(X), E(Y), E(XY).
III. If Y = −2X + 3, find Cov(X, Y).
IV. If f(x, y) = xe^{−x(y+1)} for x ≥ 0, y ≥ 0, and 0 otherwise, is the joint PDF of a
two-dimensional random variable (X, Y), find the conditional mean and variance of Y given X.
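
A quick check of Example I (joint probabilities keyed as (x, y); the layout is my own):

p = {(-1, -1): 0.0, (0, -1): 0.1, (1, -1): 0.1,
     (-1, 0): 0.2,  (0, 0): 0.2,  (1, 0): 0.2,
     (-1, 1): 0.0,  (0, 1): 0.1,  (1, 1): 0.1}
EX = sum(x * pr for (x, y), pr in p.items())       # 0.2
EY = sum(y * pr for (x, y), pr in p.items())       # 0.0
EXY = sum(x * y * pr for (x, y), pr in p.items())  # 0.0
print(EXY - EX * EY)   # Cov(X, Y) = 0 -> X and Y are uncorrelated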

Moments of a random variable:

If X is any random variable, then

1. The rᵗʰ moment of X about the origin is

μᵣ′ = E(Xʳ), r = 1, 2, 3, …

2. The rᵗʰ moment of X about the mean X̄ (central moment) is

μᵣ = E[(X − X̄)ʳ], r = 1, 2, 3, … (here, mean = X̄ = E(X))

3. The rᵗʰ moment of X about a point A (raw moment) is

μᵣ′ = E[(X − A)ʳ], r = 1, 2, 3, … (here, mean = X̄ = μ₁′ + A)

Relation between moments about the origin and moments about the mean of a random variable:
First moment: μ₁ = 0
Second moment: μ₂ = μ₂′ − (μ₁′)²
Third moment: μ₃ = μ₃′ − 3μ₂′μ₁′ + 2(μ₁′)³
Fourth moment: μ₄ = μ₄′ − 4μ₃′μ₁′ + 6μ₂′(μ₁′)² − 3(μ₁′)⁴, and so on.
(Hint: consider μᵣ = E[(X − X̄)ʳ] and expand.)
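
These relations are easy to automate. A sketch (the helper name and argument order are my own) that converts the first four moments about a point A into the mean and central moments; it reproduces Example 1 of the examples further below:

def central_moments(m1, m2, m3, m4, A=0):
    # m1..m4 are mu_r' = E[(X - A)^r]; returns (mean, mu2, mu3, mu4)
    mean = A + m1
    mu2 = m2 - m1**2
    mu3 = m3 - 3*m2*m1 + 2*m1**3
    mu4 = m4 - 4*m3*m1 + 6*m2*m1**2 - 3*m1**4
    return mean, mu2, mu3, mu4

print(central_moments(1, 4, 10, 45, A=4))   # (5, 3, 0, 26)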
Moment generating function:
The moment generating function of a random variable X is defined as
M_X(t) = E(e^{tX})
Note that:
1. If X is a discrete random variable, then M_X(t) = E(e^{tX}) = ∑ₓ e^{xt} p(x).
If X is a continuous random variable, then M_X(t) = E(e^{tX}) = ∫_{-∞}^{∞} e^{xt} f(x) dx.

2. Moments using the power series of the MGF:

M_X(t) = E(e^{tX})
= E(1 + (t/1!)X + (t²/2!)X² + ⋯ + (tʳ/r!)Xʳ + ⋯)
= 1 + (t/1!)E(X) + (t²/2!)E(X²) + ⋯ + (tʳ/r!)E(Xʳ) + ⋯
= 1 + (t/1!)μ₁′ + (t²/2!)μ₂′ + ⋯ + (tʳ/r!)μᵣ′ + ⋯

which is the MGF in terms of moments.

Hence, μᵣ′ = coefficient of tʳ/r! in the expansion of M_X(t).

3. Moments using the MGF:

μ₁′ = [d/dt M_X(t)] at t = 0

μ₂′ = [d²/dt² M_X(t)] at t = 0

In general, μᵣ′ = [dʳ/dtʳ M_X(t)] at t = 0

4. The MGF does not exist for every random variable X.


Examples:
1. The first four moments of a distribution about X = 4 are 1, 4, 10 and 45 respectively. Show
that the mean is 5, the variance is 3, μ₃ = 0 and μ₄ = 26.
2. The first four moments of a distribution about the value 4 are −1.5, 17, −30 and
108 respectively. Find the mean and the moments about the mean.
3. Find the MGF of the random variable X whose probability function is
P(X = x) = 1/2ˣ, x = 1, 2, 3, …, and find its mean.
4. If f(x) = 3e^{−3x} for x > 0 and 0 otherwise is a PDF of a r.v. X, find the first three
moments about the origin using the moment generating function, and then the first
three central moments. (See the sketch after this list.)
5. Find the MGF of the random variable X having the PDF f(x) = x for 0 < x < 1,
f(x) = 2 − x for 1 < x < 2, and 0 otherwise.
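
Example 4 can be done end to end in sympy; a sketch (the conds='none' flag just suppresses the convergence condition t < 3):

import sympy as sp

t, x = sp.symbols('t x', real=True)
f = 3 * sp.exp(-3 * x)                                             # PDF on x > 0
M = sp.integrate(sp.exp(t * x) * f, (x, 0, sp.oo), conds='none')   # MGF = 3/(3 - t)
raw = [sp.diff(M, t, r).subs(t, 0) for r in (1, 2, 3)]             # [1/3, 2/9, 2/9]
mean = raw[0]
mu2 = raw[1] - mean**2                                             # central: 1/9
mu3 = raw[2] - 3*raw[1]*mean + 2*mean**3                           # central: 2/27
print(sp.simplify(M), raw, mu2, mu3)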
Module 3. Special Probability Distributions

Discrete Distributions:

1. Binomial Distribution:
Let n be the number of trials, p the probability of success in each trial and
q the probability of failure in each trial, where q = 1 − p.
Then the random variable X is said to follow the binomial distribution if the
probability of x successes is given by P(X = x) = nCx pˣ qⁿ⁻ˣ, x = 0, 1, …, n

Note: 1. n must be finite.

2. Mean = E(X) = np

3. Variance = V(X) = npq

4. In the binomial distribution, mean > variance.

5. MGF of the binomial distribution: M_X(t) = (q + pe^t)ⁿ; about the mean it is e^{−npt}(q + pe^t)ⁿ.

6. If X₁ and X₂ are independent binomial variables, then M_{X₁+X₂}(t) = M_{X₁}(t) M_{X₂}(t).

Examples:
1. The mean and variance of a binomial distribution are 16 and 8. Find P(X ≥ 3).
(See the sketch after this list.)
2. In 256 sets of 12 tosses of a fair coin, in how many cases may one expect 8 heads
and 4 tails?
3. In a certain town, 20% of the population is literate. Assume that 200
investigators each take a sample of 10 individuals to see whether they are
literate. How many investigators would you expect to report that 3 people or less
are literate in their sample?
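
For Example 1, np = 16 and npq = 8 give q = 1/2, p = 1/2, n = 32. A plain-Python check:

from math import comb

n, p = 32, 0.5
pmf = lambda x: comb(n, x) * p**x * (1 - p)**(n - x)
p_ge_3 = 1 - sum(pmf(x) for x in range(3))   # P(X >= 3) = 1 - P(0) - P(1) - P(2)
print(p_ge_3)                                # ≈ 0.99999988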

2. Poisson Distribution:
The discrete random variable X is said to follow the Poisson distribution if
the probability of x is given by P(X = x) = e^{−λ} λˣ / x!, λ > 0, x = 0, 1, 2, 3, …

Note: 1. n (the number of trials) is infinitely large.

2. Mean = λ = Variance

3. MGF of the Poisson distribution: M_X(t) = e^{λ(eᵗ − 1)}
Examples:
1. If X and Y are independent Poisson variables such that
P(X = 1) = P(X = 2) and P(Y = 1) = P(Y = 3), find the variance of (X − 2Y).
2. If X and Y are independent Poisson variables such that
P(X = 1) = P(X = 2), find P(X ≥ 3).
3. In a component manufacturing industry there is a small chance of 1/500 for any
component to be defective. The components are supplied in packets of 10. Use the
Poisson distribution to calculate the approximate number of packets containing
i) no defective
ii) one defective
iii) two defective components
in a consignment of 10,000 packets.
4. An insurance company insures 4000 people against loss of both eyes in a car
accident. Based on previous data, the rates were computed on the assumption
that on average 10 persons in 100000 will have a car accident each year that
results in this type of injury. What is the probability that more than 3 of the
insured will collect on their policy in a given year? (See the sketch below.)
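
For Example 4, the Poisson mean is λ = 4000 × 10/100000 = 0.4. A short check:

from math import exp, factorial

lam = 4000 * 10 / 100000                             # λ = 0.4
pmf = lambda x: exp(-lam) * lam**x / factorial(x)
p_more_than_3 = 1 - sum(pmf(x) for x in range(4))    # 1 - P(0) - P(1) - P(2) - P(3)
print(round(p_more_than_3, 5))                       # ≈ 0.00078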
