Probability
Let X and Y be two discrete random variables and let p(x, y) = P(X = x, Y = y). Then p(x, y) is called the joint probability function of X and Y if the following conditions are satisfied:
(i) p(x, y) ≥ 0 for all x, y
(ii) ∑_{i=1}^{n} ∑_{j=1}^{m} p(x_i, y_j) = 1
Marginal distribution of X
In a bivariate probability distribution, the probability mass function of X alone is called the marginal distribution of X, denoted by P_i, P(x_i) or P(x):
P(x_i) = P_i = ∑_{j=1}^{m} p(x_i, y_j)
Marginal distribution of Y
In a bivariate probability distribution, the probability mass function of Y alone is called the marginal distribution of Y, denoted by P_j (also written Q_j), P(y_j) or P(y):
P(y_j) = P_j = ∑_{i=1}^{n} p(x_i, y_j)
The set of values p(x_i, y_j) = P_ij for i = 1, 2, …, n and j = 1, 2, …, m is called the joint probability distribution of X and Y. It can be arranged in a table whose row totals give the marginal distribution of X and whose column totals give the marginal distribution of Y:

X \ Y    y_1           y_2           …    y_j           …    y_m           P(x_i)
x_1      p(x_1, y_1)   p(x_1, y_2)   …    p(x_1, y_j)   …    p(x_1, y_m)   P_1
x_2      p(x_2, y_1)   p(x_2, y_2)   …    p(x_2, y_j)   …    p(x_2, y_m)   P_2
⋮        ⋮             ⋮                  ⋮                  ⋮             ⋮
x_i      p(x_i, y_1)   p(x_i, y_2)   …    p(x_i, y_j)   …    p(x_i, y_m)   P_i
⋮        ⋮             ⋮                  ⋮                  ⋮             ⋮
x_n      p(x_n, y_1)   p(x_n, y_2)   …    p(x_n, y_j)   …    p(x_n, y_m)   P_n
P(y_j)   Q_1           Q_2           …    Q_j           …    Q_m           1
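The row and column sums above are easy to compute programmatically. Below is a minimal Python sketch (using the table of Problem 1 further down as illustrative data) that forms both marginal distributions from a joint table stored as a nested dictionary.

```python
# Marginal distributions from a joint probability table.
# The joint table is stored as a nested dict: p[x][y] = P(X = x, Y = y).
# Entries here are those of Problem 1 below.

p = {
    2: {1: 0.2, 2: 0.1, 3: 0.0},
    4: {1: 0.1, 2: 0.1, 3: 0.1},
    6: {1: 0.2, 2: 0.2, 3: 0.0},
}

# P(X = x) = sum over j of p(x, y_j)  (row totals)
marginal_X = {x: sum(row.values()) for x, row in p.items()}

# P(Y = y) = sum over i of p(x_i, y)  (column totals)
y_values = sorted({y for row in p.values() for y in row})
marginal_Y = {y: sum(p[x][y] for x in p) for y in y_values}

print(marginal_X)                # {2: 0.3, 4: 0.3, 6: 0.4}  (up to rounding)
print(marginal_Y)                # {1: 0.5, 2: 0.4, 3: 0.1}  (up to rounding)
print(sum(marginal_X.values()))  # ~1.0, condition (ii)
```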
Independent Random variables
Two random variables X and Y are said to be independent if their joint probability mass function equals the product of their marginal distributions, i.e., p(x_i, y_j) = P_i Q_j for all i, j.
OR
E(XY) = E(X)E(Y)
If X and Y are two discrete random variables having the joint probability function p(x, y), then the expectations are defined as
E(X) = ∑ x P(x),   E(Y) = ∑ y P(y),   E(XY) = ∑∑ x y p(x, y)
Variance: V(X) = E(X²) − [E(X)]², V(Y) = E(Y²) − [E(Y)]²
Standard deviation: σ_X = √V(X), σ_Y = √V(Y)
Covariance and correlation: Cov(X, Y) = E(XY) − E(X)E(Y), r = ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y)
Problems:
1. The joint probability distribution of X and Y is given by the following table. Find the marginal distributions of X and Y and the correlation coefficient between X and Y.

X \ Y    1      2      3
2        0.2    0.1    0
4        0.1    0.1    0.1
6        0.2    0.2    0
Solution: Given the table, with the marginal totals appended:

X \ Y    1      2      3      P(X)
2        0.2    0.1    0      0.3
4        0.1    0.1    0.1    0.3
6        0.2    0.2    0      0.4
P(Y)     0.5    0.4    0.1    1
Marginal distribution of X is
X 2 4 6
P(X) 0.3 0.3 0.4
Marginal distribution of Y is
Y 1 2 3
P(Y) 0.5 0.4 0.1
E(X) = ∑x P(x) = (2 × 0.3) + (4 × 0.3) + (6 × 0.4) = 4.2
E(X²) = (4 × 0.3) + (16 × 0.3) + (36 × 0.4) = 20.4
V(X) = E(X²) − [E(X)]² = 20.4 − 17.64 = 2.76, so σ_X = √V(X) = 1.6613
E(Y) = ∑y P(y) = (1 × 0.5) + (2 × 0.4) + (3 × 0.1) = 1.6
E(Y²) = (1 × 0.5) + (4 × 0.4) + (9 × 0.1) = 3.0
V(Y) = E(Y²) − [E(Y)]² = 3.0 − 2.56 = 0.44, so σ_Y = √V(Y) = 0.6633
E(XY) = ∑∑ x y p(x, y) = (2 × 1 × 0.2) + (2 × 2 × 0.1) + (4 × 1 × 0.1) + (4 × 2 × 0.1) + (4 × 3 × 0.1) + (6 × 1 × 0.2) + (6 × 2 × 0.2) = 6.8
Cov(X, Y) = E(XY) − E(X)E(Y) = 6.8 − (4.2 × 1.6) = 0.08
r = Cov(X, Y) / (σ_X σ_Y) = 0.08 / (1.6613 × 0.6633) = 0.0726
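As a cross-check of the hand computation, the following Python sketch recomputes E(X), E(Y), Cov(X, Y) and r directly from the joint table of Problem 1; the printed values agree with those above up to rounding.

```python
# Verification sketch for Problem 1: moments and correlation from the table.
import math

xs, ys = [2, 4, 6], [1, 2, 3]
p = [[0.2, 0.1, 0.0],
     [0.1, 0.1, 0.1],
     [0.2, 0.2, 0.0]]   # p[i][j] = P(X = xs[i], Y = ys[j])

EX  = sum(x * sum(row) for x, row in zip(xs, p))
EY  = sum(y * sum(p[i][j] for i in range(3)) for j, y in enumerate(ys))
EX2 = sum(x**2 * sum(row) for x, row in zip(xs, p))
EY2 = sum(y**2 * sum(p[i][j] for i in range(3)) for j, y in enumerate(ys))
EXY = sum(xs[i] * ys[j] * p[i][j] for i in range(3) for j in range(3))

sigma_X = math.sqrt(EX2 - EX**2)      # ~1.6613
sigma_Y = math.sqrt(EY2 - EY**2)      # ~0.6633
cov = EXY - EX * EY                   # ~0.08
r = cov / (sigma_X * sigma_Y)         # ~0.0726
print(EX, EY, cov, r)
```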
2. A joint probability distribution is given by the following table. Find (i) the marginal distributions of X and Y, (ii) μ_X, μ_Y, σ_X, σ_Y, and (iii) the correlation coefficient between X and Y.

X \ Y    -3     2      4
1        0.1    0.2    0.2
3        0.3    0.1    0.1
Solution: Adding the marginal totals:

X \ Y    -3     2      4      P(X)
1        0.1    0.2    0.2    0.5
3        0.3    0.1    0.1    0.5
P(Y)     0.4    0.3    0.3    1
Marginal distribution of X
X 1 3
P(X) 0.5 0.5
Marginal distribution of Y
Y -3 2 4
P(Y) 0.4 0.3 0.3
μ_X = E(X) = (1 × 0.5) + (3 × 0.5) = 2
E(X²) = (1 × 0.5) + (9 × 0.5) = 5
V(X) = E(X²) − [E(X)]² = 5 − 2² = 1
μ_Y = E(Y) = (−3 × 0.4) + (2 × 0.3) + (4 × 0.3) = 0.6
E(Y²) = (9 × 0.4) + (4 × 0.3) + (16 × 0.3) = 9.6
V(Y) = E(Y²) − [E(Y)]² = 9.6 − 0.36 = 9.24
σ_X = √V(X) = 1
σ_Y = √V(Y) = 3.0397
E(XY) = ∑∑ x y p(x, y) = (1 × −3 × 0.1) + (1 × 2 × 0.2) + (1 × 4 × 0.2) + (3 × −3 × 0.3) + (3 × 2 × 0.1) + (3 × 4 × 0.1) = 0
Cov(X, Y) = E(XY) − E(X)E(Y) = 0 − (2 × 0.6) = −1.2
r = Cov(X, Y) / (σ_X σ_Y) = −1.2 / (1 × 3.0397) = −0.3947
3. A coin is tossed three times. Let 𝑋 be equal to 0 or 1 according as a head or a tail occurs on
the first toss. Let 𝑌 be equal to the total number of heads which occurs. Determine (i) the
marginal distributions of 𝑋 and 𝑌, and (ii) the joint distribution of 𝑋 and 𝑌, (iii) expected
values of 𝑋, 𝑌, 𝑋 + 𝑌 and 𝑋𝑌, (iv) 𝜎𝑋 and 𝜎𝑌 , (v) 𝐶𝑜𝑣(𝑋, 𝑌) and 𝜌(𝑋, 𝑌).
Solution:
(i) The distribution of the random variable X (X = 0 if the first toss is a head, X = 1 if it is a tail):
X        0      1
P(X)     4/8    4/8
The distribution of Y (the total number of heads):
Y        0      1      2      3
P(Y)     1/8    3/8    3/8    1/8
(ii) The joint distribution of X and Y:
X \ Y    0      1      2      3      P(X)
0        0      1/8    2/8    1/8    4/8
1        1/8    2/8    1/8    0      4/8
P(Y)     1/8    3/8    3/8    1/8    1
(iii) E[X] = μ_X = ∑ x_i P(x_i) = (0 × 4/8) + (1 × 4/8) = 4/8 = 0.5
E[Y] = μ_Y = ∑ y_j P(y_j) = (0 × 1/8) + (1 × 3/8) + (2 × 3/8) + (3 × 1/8) = 12/8 = 1.5
E[X + Y] = E[X] + E[Y] = 0.5 + 1.5 = 2
E[XY] = ∑∑ x y p(x, y) = (1 × 1 × 2/8) + (1 × 2 × 1/8) = 4/8 = 0.5
(iv) σ²_X = E[X²] − [E(X)]² = (0² × 4/8) + (1² × 4/8) − (4/8)² = 1/4, so σ_X = 1/2
σ²_Y = E[Y²] − [E(Y)]² = (0² × 1/8) + (1² × 3/8) + (2² × 3/8) + (3² × 1/8) − (3/2)² = 3 − 9/4 = 3/4, so σ_Y = √3/2
(v) Cov(X, Y) = E(XY) − E(X)E(Y) = 1/2 − (1/2 × 3/2) = −1/4
r = ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y) = (−1/4) / ((1/2)(√3/2)) = −1/√3.
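Because the sample space has only 8 equally likely outcomes, the moments in Problem 3 can also be verified by brute-force enumeration; the sketch below uses exact fractions and reproduces E(X) = 1/2, E(Y) = 3/2 and Cov(X, Y) = −1/4.

```python
# Brute-force check for Problem 3: enumerate the 8 equally likely outcomes
# of three coin tosses, with X = 0/1 for head/tail on the first toss and
# Y = total number of heads.
from itertools import product
from fractions import Fraction

outcomes = list(product("HT", repeat=3))   # 8 outcomes, each with prob 1/8
prob = Fraction(1, 8)

EX = sum(prob * (0 if o[0] == "H" else 1) for o in outcomes)     # 1/2
EY = sum(prob * o.count("H") for o in outcomes)                  # 3/2
EXY = sum(prob * (0 if o[0] == "H" else 1) * o.count("H")
          for o in outcomes)                                     # 1/2
cov = EXY - EX * EY                                              # -1/4
print(EX, EY, EXY, cov)
```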
4. A joint probability distribution is given by the following table.

X \ Y    2      3      4
1        0.06   0.15   0.09
2        0.14   0.35   0.21
Determine the marginal distributions of X and Y. Also verify that X and Y are independent.
Solution: Adding the marginal totals:

X \ Y    2      3      4      P(X)
1        0.06   0.15   0.09   0.3
2        0.14   0.35   0.21   0.7
P(Y)     0.2    0.5    0.3    1
Here P_1 = 0.3, P_2 = 0.7 and Q_1 = 0.2, Q_2 = 0.5, Q_3 = 0.3.
For every cell p(x_i, y_j) = P_i Q_j (for example, p(1, 2) = 0.06 = 0.3 × 0.2 and p(2, 3) = 0.35 = 0.7 × 0.5). Hence X and Y are independent.
Alternate method:
Marginal distribution of X
X 1 2
P(X) 0.3 0.7
E(X) = (1 × 0.3) + (2 × 0.7) = 1.7
Marginal distribution of Y
Y        2      3      4
P(Y)     0.2    0.5    0.3
E(Y) = (2 × 0.2) + (3 × 0.5) + (4 × 0.3) = 3.1
E(XY) = ∑∑ x y p(x, y) = 0.12 + 0.45 + 0.36 + 0.56 + 2.1 + 1.68 = 5.27
Since E(XY) = 5.27 = 1.7 × 3.1 = E(X)E(Y), X and Y are independent.
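A quick way to verify the independence claim of Problem 4 numerically is to check p(x, y) = P(X = x)P(Y = y) cell by cell, as in this Python sketch.

```python
# Independence check for Problem 4: X and Y are independent iff
# p(x, y) = P(X = x) * P(Y = y) for every cell of the table.
xs, ys = [1, 2], [2, 3, 4]
p = {(1, 2): 0.06, (1, 3): 0.15, (1, 4): 0.09,
     (2, 2): 0.14, (2, 3): 0.35, (2, 4): 0.21}

PX = {x: sum(p[(x, y)] for y in ys) for x in xs}   # {1: 0.3, 2: 0.7}
PY = {y: sum(p[(x, y)] for x in xs) for y in ys}   # {2: 0.2, 3: 0.5, 4: 0.3}

independent = all(abs(p[(x, y)] - PX[x] * PY[y]) < 1e-12
                  for x in xs for y in ys)
print(independent)   # True
```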
5. The probability distributions of two stochastically independent random variables X and Y are given by the following tables.
X        0      1
P(X)     0.2    0.8

Y        1      2      3
P(Y)     0.1    0.4    0.5
Find the joint probability distribution. Also compute E(X) and E(Y).
Solution:
X \Y 1 2 3 P(X)
0 0.02 0.08 0.1 0.2
1 0.08 0.32 0.4 0.8
P(Y) 0.1 0.4 0.5 1
𝐸(𝑋) = 0.8
𝐸(𝑌) = 2.4
𝐸(𝑋𝑌) = ∑𝑥𝑦𝑝(𝑥, 𝑦) = 0 + 0 + 0 + 0.08 + 0.64 + 1.2 = 1.92
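The joint table in Problem 5 is simply the outer product of the two marginal distributions; the short sketch below builds it that way and confirms E(XY) = E(X)E(Y).

```python
# Problem 5: for stochastically independent X and Y, the joint table is the
# product of the marginals, p(x, y) = P(X = x) * P(Y = y).
PX = {0: 0.2, 1: 0.8}
PY = {1: 0.1, 2: 0.4, 3: 0.5}

joint = {(x, y): PX[x] * PY[y] for x in PX for y in PY}
EX = sum(x * px for x, px in PX.items())                 # 0.8
EY = sum(y * py for y, py in PY.items())                 # 2.4
EXY = sum(x * y * pxy for (x, y), pxy in joint.items())  # 1.92 = E(X)E(Y)
print(joint, EX, EY, EXY)
```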
6. The joint probability distribution of X and Y is given by the following table. Find Cov(X, Y) and ρ(X, Y).

X \ Y    1       3       9       P(X)
2        1/8     1/24    1/12    1/4
4        1/4     1/4     0       1/2
6        1/8     1/24    1/12    1/4
P(Y)     1/2     1/3     1/6     1

Solution:
μ_X = ∑ x_i P(x_i) = (2 × 1/4) + (4 × 1/2) + (6 × 1/4) = 4
μ_Y = ∑ y_j P(y_j) = (1 × 1/2) + (3 × 1/3) + (9 × 1/6) = 3
E[XY] = ∑_i ∑_j x_i y_j P_ij
= (2 × 1/8) + (6 × 1/24) + (18 × 1/12) + (4 × 1/4) + (12 × 1/4) + (36 × 0) + (6 × 1/8) + (18 × 1/24) + (54 × 1/12)
= 2 + 4 + 6 = 12
Cov(X, Y) = E[XY] − μ_X μ_Y = 12 − 12 = 0
ρ(X, Y) = 0.
7. For the following bivariate probability distribution of X and Y find
(i) 𝑃(𝑋 ≤ 1, 𝑌 = 2) (ii) 𝑃(𝑋 ≤ 1) (iii) 𝑃(𝑌 = 3) (iv) 𝑃(𝑌 ≤ 3) (v) 𝑃(𝑋 < 3, 𝑌 ≤ 4)
X \ Y    1      2      3      4      5      6
0        0      0      1/32   2/32   2/32   3/32
1        1/16   1/16   1/8    1/8    1/8    1/8
2        1/32   1/32   1/64   1/64   0      2/64
Solution: Adding the marginal totals:

X \ Y    1      2      3      4      5      6      P(X)
0        0      0      1/32   2/32   2/32   3/32   8/32
1        1/16   1/16   1/8    1/8    1/8    1/8    10/16
2        1/32   1/32   1/64   1/64   0      2/64   8/64
P(Y)     3/32   3/32   11/64  13/64  6/32   16/64  1
(i) P(X ≤ 1, Y = 2) = P(X = 0, Y = 2) + P(X = 1, Y = 2) = 0 + 1/16 = 1/16
(ii) P(X ≤ 1) = P(X = 0) + P(X = 1) = 8/32 + 10/16 = 7/8
(iii) P(Y = 3) = 11/64
(iv) P(Y ≤ 3) = 3/32 + 3/32 + 11/64 = 23/64
(v) P(X < 3, Y ≤ 4) = (0 + 0 + 1/32 + 2/32) + (1/16 + 1/16 + 1/8 + 1/8) + (1/32 + 1/32 + 1/64 + 1/64) = 9/16
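Since every entry of the table in Problem 7 is a simple fraction, the requested probabilities can be checked exactly with Python's fractions module, as in the sketch below.

```python
# Problem 7: the requested probabilities computed exactly with fractions.
from fractions import Fraction as F

xs, ys = [0, 1, 2], [1, 2, 3, 4, 5, 6]
p = {0: [F(0), F(0), F(1, 32), F(2, 32), F(2, 32), F(3, 32)],
     1: [F(1, 16), F(1, 16), F(1, 8), F(1, 8), F(1, 8), F(1, 8)],
     2: [F(1, 32), F(1, 32), F(1, 64), F(1, 64), F(0), F(2, 64)]}

def P(cond):
    """Sum p(x, y) over all cells (x, y) satisfying cond(x, y)."""
    return sum(p[x][j] for x in xs for j, y in enumerate(ys) if cond(x, y))

print(P(lambda x, y: x <= 1 and y == 2))   # 1/16
print(P(lambda x, y: x <= 1))              # 7/8
print(P(lambda x, y: y == 3))              # 11/64
print(P(lambda x, y: y <= 3))              # 23/64
print(P(lambda x, y: x < 3 and y <= 4))    # 9/16
```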
8. For the following bivariate probability distribution, find the value of k.
X \ Y    1      2      3
-5       0      0.1    0.1
0        0.1    k      0.2
5        0.2    0.1    0
Solution: Adding the marginal totals:

X \ Y    1      2        3      P(X)
-5       0      0.1      0.1    0.2
0        0.1    k        0.2    0.3 + k
5        0.2    0.1      0      0.3
P(Y)     0.3    0.2 + k  0.3    1
0.8 + 𝑘 = 1
𝑘 = 0.2
Bayes' theorem
If B_1, B_2, …, B_n are mutually disjoint events with P(B_i) ≠ 0 (i = 1, 2, …, n), then for any arbitrary event A which is a subset of ⋃_{i=1}^{n} B_i such that P(A) > 0,
P(B_i | A) = P(B_i) P(A | B_i) / ∑_{i=1}^{n} P(B_i) P(A | B_i).
1. In a certain assembly plant, three machines, B1, B2, and B3, make 30%, 45%, and 25%,
respectively, of the products. It is known from past experience that 2%, 3%, and 2% of the products
made by each machine, respectively, are defective. Now, suppose that a finished product is randomly
selected. What is the probability that it is defective? What is the probability that it was made by
machine B3?
Solution: Consider the following events:
A: the product is defective,
B1: the product is made by machine B1,
B2: the product is made by machine B2,
B3: the product is made by machine B3.
By the total probability rule, the probability that a randomly selected product is defective is
P(A) = P(B1)P(A|B1) + P(B2)P(A|B2) + P(B3)P(A|B3) = (0.3 × 0.02) + (0.45 × 0.03) + (0.25 × 0.02) = 0.006 + 0.0135 + 0.005 = 0.0245
Using Bayes' rule, the probability that the defective product was made by machine B3 is
P(B3|A) = P(B3)P(A|B3) / [P(B1)P(A|B1) + P(B2)P(A|B2) + P(B3)P(A|B3)] = 0.005 / 0.0245 = 0.204
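The same computation can be written in a few lines of Python: the total-probability sum gives P(A), and the Bayes ratio gives the posterior for B3. A small sketch:

```python
# Problem 1 (Bayes): total probability of a defective item and the posterior
# probability that it came from machine B3.
prior = {"B1": 0.30, "B2": 0.45, "B3": 0.25}          # P(Bi)
defect_rate = {"B1": 0.02, "B2": 0.03, "B3": 0.02}    # P(A | Bi)

P_A = sum(prior[b] * defect_rate[b] for b in prior)   # 0.0245
posterior_B3 = prior["B3"] * defect_rate["B3"] / P_A  # 0.005 / 0.0245 ~ 0.204
print(P_A, posterior_B3)
```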
2. An office has 4 secretaries handling respectively 20%, 60%, 15% and 5% of the files of all government reports. The probabilities that they misfile such reports are respectively 0.05, 0.1, 0.1 and 0.05. Find the probability that a misfiled report can be blamed on the first secretary.
Solution: Let A_1, A_2, A_3 and A_4 be the events that a file is handled by the first, second, third and fourth secretary respectively, and let E be the event that a report is misfiled. Then
P(A_1) = 0.2, P(A_2) = 0.6, P(A_3) = 0.15, P(A_4) = 0.05
P(E|A_1) = 0.05, P(E|A_2) = 0.1, P(E|A_3) = 0.1, P(E|A_4) = 0.05
P(E) = ∑ P(A_i)P(E|A_i) = (0.2 × 0.05) + (0.6 × 0.1) + (0.15 × 0.1) + (0.05 × 0.05) = 0.0875
By Bayes' theorem,
P(A_1|E) = P(A_1)P(E|A_1) / P(E) = 0.01 / 0.0875 = 0.1142
3. One factory F1 produces 1000 articles of which 20 are defective, second factory F2 produces 4000
articles of which 40 are defective and the third factory F3 produces 5000 articles of which 50 are
defective. All these articles are put in one stock file. If one of them is chosen, what is the probability
that it is defective? If a chosen item is defective, what is the probability that it is from factory F1?
Solution: Let E be the event that a chosen article is defective. Since there are 10,000 articles in all (produced by F1, F2 and F3 taken together), we have
P(F_1) = 1000/10000 = 0.1, P(F_2) = 4000/10000 = 0.4, P(F_3) = 5000/10000 = 0.5
P(E|F_1) = 20/1000 = 0.02, P(E|F_2) = 40/4000 = 0.01, P(E|F_3) = 50/5000 = 0.01
Therefore, the probability of selecting a defective article from the lot is,
𝑃(𝐸) = 𝑃(𝐹1 )𝑃(𝐸|𝐹1 ) + 𝑃(𝐹2 )𝑃(𝐸|𝐹2 ) + 𝑃(𝐹3 )𝑃(𝐸|𝐹3 )
𝑃(𝐸) = 0.002 + 0.004 + 0.005
𝑃(𝐸) = 0.011
Using Bayes' rule, the probability that a defective article came from factory F1 is
P(F_1|E) = P(F_1)P(E|F_1) / P(E) = 0.002 / 0.011 = 0.1818
5. A shipment of components consists of 3 identical boxes. The first box contains 200 components of
which 25% are defective, the second box has 5000 components of which 20% are defective and the
third box contains 2000 components of which 600 are defective. A box is selected at random and a
component is removed at random from this box. What is the probability that this component is
defective? If the component drawn is found to be defective, what is the probability that it came from
the first or second box?
Solution: Let A, B and C denote the events of choosing the first, second and third box respectively. Since a box is selected at random,
P(A) = P(B) = P(C) = 1/3
Let E be the event that the component drawn is defective, and let P(E|A), P(E|B) and P(E|C) denote the probabilities of drawing a defective component from the first, second and third box respectively:
P(E|A) = 0.25, P(E|B) = 0.2, P(E|C) = 600/2000 = 0.3
Therefore, the probability of drawing a defective component from an arbitrarily chosen box is
𝑃(𝐸) = 𝑃(𝐴)𝑃(𝐸|𝐴) + 𝑃(𝐵)𝑃(𝐸|𝐵) + 𝑃(𝐶)𝑃(𝐸|𝐶)
P(E) = (1/3)[0.25 + 0.2 + 0.3] = 0.25
The probability that a drawn component is defective and it is from the first box is
P(A|E) = P(A)P(E|A) / P(E) = (1/3 × 0.25) / 0.25 = 1/3 = 0.3333
The probability that a drawn component is defective and it is from the second box is
P(B|E) = P(B)P(E|B) / P(E) = (1/3 × 0.2) / 0.25 = 0.2667
Consequently, the probability that a drawn component is defective and it is from the first or second
box is
𝑃(𝐴|𝐸) + 𝑃(𝐵|𝐸) = 0.3333 + 0.2667 = 0.6.
6. A company manufacturing ball pens in two writing colours blue and red make packets of 10 pens
with 5 pens of each colour. In a particular shop it was found that after sales, packet 1 contained 3 blue
and 2 red pens, packet 2 contained 2 blue and 3 red pens, packet 3 contained 3 blue and 5 red pens.
On the demand of a customer for a pen, a packet was drawn at random and a pen was taken out. It was found to be blue. Find the probability that packet 1 was selected.
Solution: Let P_1, P_2 and P_3 be the events of selecting packets 1, 2 and 3 respectively at random, and let B be the event that the pen taken out is blue.
P(P_1) = P(P_2) = P(P_3) = 1/3
P(B|P_1) = 3/5, P(B|P_2) = 2/5, P(B|P_3) = 3/8
P(B) = (1/3)(3/5) + (1/3)(2/5) + (1/3)(3/8) = (1/3)(3/5 + 2/5 + 3/8) = 11/24
By Bayes' theorem,
P(P_1|B) = P(P_1)P(B|P_1) / P(B) = (1/3 × 3/5) / (11/24) = 24/55 = 0.4364
The joint probability density function (pdf) of two continuous random variables (X, Y) is a function f(x, y) satisfying the following conditions:
(i) f(x, y) ≥ 0 for all x, y
(ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
The function p_1(x) = f_X(x) = ∫_{−∞}^{∞} f(x, y) dy is called the marginal density function of X.
The function p_2(y) = h(y) = f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx is called the marginal density function of Y.
X and Y are independent if f(x, y) = p_1(x) p_2(y) for all x, y; in that case
E(XY) = E(X)E(Y)
Mean, Variance and Covariance
E(X) = ∫_{−∞}^{∞} x p_1(x) dx
E(Y) = ∫_{−∞}^{∞} y p_2(y) dy
E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y f(x, y) dx dy
E(X + Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x + y) f(x, y) dx dy
σ_X = √(E(X²) − [E(X)]²),  σ_Y = √(E(Y²) − [E(Y)]²)
Cov(X, Y) = E(XY) − E(X)E(Y)
1. Let (X, Y) be a continuous random variable with joint pdf
f(x, y) = x² + xy/3,  0 ≤ x ≤ 1, 0 ≤ y ≤ 2
        = 0,           elsewhere
(i) Is f(x, y) a probability density function? (ii) Find the marginal pdfs of X and Y.
Solution:
(i) ∫_{x=0}^{1} ∫_{y=0}^{2} (x² + xy/3) dy dx
= ∫_{0}^{1} [x²y + xy²/6]_{y=0}^{2} dx
= ∫_{0}^{1} (2x² + 2x/3) dx
= [2x³/3 + x²/3]_{0}^{1} = 2/3 + 1/3
= 1
Therefore f(x, y) is a probability density function.
Marginal density of X
p_1(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_{0}^{2} (x² + xy/3) dy = [x²y + xy²/6]_{y=0}^{2}
p_1(x) = 2x² + 2x/3,  0 ≤ x ≤ 1
Marginal density of Y
p_2(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_{0}^{1} (x² + xy/3) dx = [x³/3 + x²y/6]_{x=0}^{1}
p_2(y) = 1/3 + y/6,  0 ≤ y ≤ 2
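For the continuous case the integrals can be checked symbolically; the sketch below uses sympy to confirm that f(x, y) integrates to 1 and to reproduce the two marginal densities found above.

```python
# Symbolic check for Problem 1: the double integral of f(x, y) equals 1 and
# the marginal densities match the results above.
import sympy as sp

x, y = sp.symbols("x y", positive=True)
f = x**2 + x*y/3

total = sp.integrate(f, (y, 0, 2), (x, 0, 1))   # 1
p1 = sp.integrate(f, (y, 0, 2))                 # 2*x**2 + 2*x/3
p2 = sp.integrate(f, (x, 0, 1))                 # y/6 + 1/3
print(total, p1, sp.simplify(p2))
```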
2. Find the constant ‘𝑘’ so that
h(x, y) = k(x + 1)e^{−y},  0 < x < 1, y > 0
        = 0,               elsewhere
is a joint probability density function. Are X and Y independent?
∫_{−∞}^{∞} ∫_{−∞}^{∞} h(x, y) dx dy = ∫_{y=0}^{∞} ∫_{x=0}^{1} k(x + 1)e^{−y} dx dy
= k {∫_{0}^{1} (x + 1) dx} {∫_{0}^{∞} e^{−y} dy}
1 = k {3/2}{0 + 1} = (3/2)k
⇒ k = 2/3
Hence ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(x, y) dx dy = 1 if k = 2/3.
Therefore, h(x, y) is a joint probability density function if k = 2/3.
With k = 2/3, the marginal density functions are
p_1(x) = ∫_{−∞}^{∞} h(x, y) dy = (2/3)(x + 1) ∫_{0}^{∞} e^{−y} dy = (2/3)(x + 1)(0 + 1)
p_1(x) = (2/3)(x + 1),  0 < x < 1
p_2(y) = ∫_{−∞}^{∞} h(x, y) dx = (2/3)e^{−y} ∫_{0}^{1} (x + 1) dx = (2/3)e^{−y}(3/2)
p_2(y) = e^{−y},  y > 0.
Therefore, 𝑝1 (𝑥)𝑝2 (𝑦) = ℎ(𝑥, 𝑦) and hence 𝑋 and Y are stochastically independent.
3. The joint pdf of two continuous random variables X and Y is
f(x, y) = 4xy,  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
        = 0,    elsewhere
Find the marginal densities of X and Y, E(X), E(Y), E(X + Y) and E(XY), and verify that X and Y are independent.
Solution:
p_1(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_{0}^{1} 4xy dy = 2x,  0 ≤ x ≤ 1
p_2(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_{0}^{1} 4xy dx = 2y,  0 ≤ y ≤ 1
E(X) = ∫_{−∞}^{∞} x p_1(x) dx = ∫_{0}^{1} 2x² dx = 2/3
E(Y) = ∫_{−∞}^{∞} y p_2(y) dy = ∫_{0}^{1} 2y² dy = 2/3
E(X + Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x + y) f(x, y) dx dy = ∫_{0}^{1} ∫_{0}^{1} (x + y) 4xy dx dy
= ∫_{0}^{1} ∫_{0}^{1} (4x²y + 4xy²) dx dy
= ∫_{0}^{1} (4y/3 + 2y²) dy
= [4y²/6 + 2y³/3]_{0}^{1}
= 2/3 + 2/3 = 4/3
Hence 𝐸(𝑋 + 𝑌) = 𝐸(𝑋) + 𝐸(𝑌)
E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y f(x, y) dx dy = ∫_{0}^{1} ∫_{0}^{1} 4x²y² dx dy
= ∫_{0}^{1} (4y²/3) dy
= 4/9 = (2/3)(2/3)
Hence 𝐸(𝑋𝑌) = 𝐸(𝑋)𝐸(𝑌)
Therefore X and Y are independent.
4. The joint pdf of two continuous random variables X and Y is
f(x, y) = xy/96,  0 < x < 4, 1 < y < 5
        = 0,      elsewhere
Find (i) P(1 < x < 2, 2 < y < 3), (ii) P(x ≥ 3, y ≤ 2), (iii) P(y ≤ x), (iv) P(y > x), (v) P(x + y ≤ 3), (vi) P(x + y > 3).
Solution:
(i) P(1 < x < 2, 2 < y < 3) = ∫_{x=1}^{2} ∫_{y=2}^{3} (xy/96) dy dx = 5/128
(ii) P(x ≥ 3, y ≤ 2) = ∫_{x=3}^{4} ∫_{y=1}^{2} (xy/96) dy dx = 7/128
(iii) P(y ≤ x) = ∫_{x=1}^{4} ∫_{y=1}^{x} (xy/96) dy dx
= ∫_{1}^{4} [xy²/192]_{y=1}^{x} dx
= (1/192) ∫_{1}^{4} (x³ − x) dx
P(y ≤ x) = 75/256
(iv) P(y > x) = 1 − P(y ≤ x) = 1 − 75/256 = 181/256
(v) P(x + y ≤ 3) = ∫_{x=0}^{2} ∫_{y=1}^{3−x} (xy/96) dy dx
= ∫_{0}^{2} [xy²/192]_{y=1}^{3−x} dx
= (1/192) ∫_{0}^{2} [x(3 − x)² − x] dx
= (1/192) ∫_{0}^{2} [x(9 + x² − 6x) − x] dx
= (1/192) ∫_{0}^{2} (8x + x³ − 6x²) dx
P(x + y ≤ 3) = 1/48
(vi) P(x + y > 3) = 1 − P(x + y ≤ 3) = 1 − 1/48 = 47/48
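The probabilities in Problem 4 can be verified symbolically as well; the sketch below uses sympy with f(x, y) = xy/96 on 0 < x < 4, 1 < y < 5, the support being the one consistent with the integration limits used above.

```python
# Sketch for Problem 4: symbolic verification of the density and probabilities.
import sympy as sp

x, y = sp.symbols("x y", positive=True)
f = x*y/96

print(sp.integrate(f, (y, 1, 5), (x, 0, 4)))        # 1 (valid density)
print(sp.integrate(f, (y, 2, 3), (x, 1, 2)))        # 5/128
print(sp.integrate(f, (y, 1, 2), (x, 3, 4)))        # 7/128
print(sp.integrate(f, (y, 1, x), (x, 1, 4)))        # 75/256 -> P(Y <= X)
print(sp.integrate(f, (y, 1, 3 - x), (x, 0, 2)))    # 1/48   -> P(X + Y <= 3)
```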
5. Verify that
f(x, y) = e^{−(x+y)},  x ≥ 0, y ≥ 0
        = 0,           elsewhere
is the density function of a joint probability distribution. Then evaluate the following:
(i) P(1/2 < x < 2, 0 < y < 4) (ii) P(x < 1) (iii) P(x > y) (iv) P(x + y ≤ 1).
Solution:
∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = ∫_{0}^{∞} ∫_{0}^{∞} e^{−(x+y)} dx dy = {∫_{0}^{∞} e^{−x} dx}{∫_{0}^{∞} e^{−y} dy}
= (0 + 1)(0 + 1) = 1.
Therefore, 𝑓(𝑥, 𝑦) is a density function.
(i) P(1/2 < x < 2, 0 < y < 4) = ∫_{1/2}^{2} ∫_{0}^{4} e^{−(x+y)} dy dx = {∫_{1/2}^{2} e^{−x} dx}{∫_{0}^{4} e^{−y} dy} = (e^{−1/2} − e^{−2})(1 − e^{−4}).
(ii) P(x < 1) = {∫_{0}^{1} e^{−x} dx}{∫_{0}^{∞} e^{−y} dy} = (1 − e^{−1})(1) = 0.6321.
(iii) P(x ≤ y) = ∫_{0}^{∞} {∫_{0}^{y} e^{−(x+y)} dx} dy = ∫_{0}^{∞} e^{−y} (∫_{0}^{y} e^{−x} dx) dy = ∫_{0}^{∞} e^{−y}(1 − e^{−y}) dy
= ∫_{0}^{∞} (e^{−y} − e^{−2y}) dy = 1 − 1/2 = 1/2
Therefore, P(x > y) = 1 − P(x ≤ y) = 1 − 1/2 = 1/2.
(iv) P(x + y ≤ 1) = ∫_{x=0}^{1} ∫_{y=0}^{1−x} e^{−(x+y)} dy dx = ∫_{0}^{1} e^{−x} {∫_{0}^{1−x} e^{−y} dy} dx = ∫_{0}^{1} e^{−x} {1 − e^{−(1−x)}} dx
= ∫_{0}^{1} (e^{−x} − e^{−1}) dx = 1 − 2/e.
Random variable
A random variable is a real number X connected with the outcome of a random experiment E. For example, if E consists of three tosses of a coin, one can consider the random variable X which is the number of heads obtained (0, 1, 2 or 3):

Outcome      HHH  HHT  HTH  THH  HTT  THT  TTH  TTT
Value of X   3    2    2    2    1    1    1    0

Let S denote the sample space of a random experiment. A random variable is a rule which assigns a numerical value to each and every outcome of the experiment. Thus, a random variable is a function X(ω) with domain S and range (−∞, ∞) such that for every real number a, the event [ω : X(ω) ≤ a] ∈ B, the field of subsets of S. It is denoted as f: S → R.
Note that all the outcomes of the experiment are associated with a unique number. Therefore, f is an
example of a random variable. Usually, a random variable is denoted by letters such as X, Y, Z etc. The
image set of the random variable may be written as f(S) = {0, 1, 2, 3}.
Discrete Random Variable: A discrete random variable is one which takes only a countable number
of distinct values such as 0, 1, 2, 3, … . Discrete random variables are usually (but not necessarily)
counts. If a random variable takes at most a countable number of values, it is called a discrete random
variable. In other words, a real valued function defined on a discrete sample space is called a discrete
random variable.
(i) In the experiment of throwing a die, define X as the number that is obtained. Then X takes any
of the values 1 – 6. Thus, X(S) = {1, 2, 3…6} which is a finite set and hence X is a DRV.
(ii) If X be the random variable denoting the number of marks scored by a student in a subject of
an examination, then X(S) = {0, 1, 2, 3,…100}. Then, X is a DRV.
(iii) The number of children in a family is a DRV.
(iv) The number of defective light bulbs in a box of ten is a DRV.
Probability Mass Function: Suppose X is a one-dimensional discrete random variable taking at most a countably infinite number of values x_1, x_2, …. With each possible outcome x_i one associates a number p_i = P(X = x_i) = p(x_i), called the probability of x_i, satisfying
(i) p(x_i) ≥ 0 for all i,
(ii) ∑_{i=1}^{∞} p(x_i) = 1.
This function 𝑝 is called the probability mass function of the random variable X and the set
{𝑥𝑖 , 𝑝(𝑥𝑖 )} is called the probability distribution of the random variable X.
Continuous Random Variable: A continuous random variable is not defined at specific values.
Instead, it is defined over an interval of values, and is represented by the area under a curve. Thus, a
random variable X is said to be continuous if it can take all possible values between certain limits. In
other words, a random variable is said to be continuous when its different values cannot be put in 1-1
correspondence with a set of positive integers. Here, the probability of observing any single value is
equal to zero, since the number of values which may be assumed by the random variable is infinite.
A continuous random variable is a random variable that (at least conceptually) can be measured
to any desired degree of accuracy.
Important Remark: In case of DRV, the probability at a point i.e., P (x = c) is not zero for some
fixed c. However, in case of CRV the probability at a point is always zero.
Probability Density Function: The probability density function (p.d.f.) of a continuous random variable X, usually denoted by f_X(x) or simply f(x), has the following properties:
(i) f(x) ≥ 0 for all x
(ii) ∫_{−∞}^{∞} f(x) dx = 1
(iii) P(X ∈ A) = ∫_{a}^{b} f(x) dx for A = (a, b), i.e., P(a ≤ X ≤ b) = ∫_{a}^{b} f(x) dx
The mean or expected value of a DRV X is defined as μ = E(X) = ∑_{i=1}^{n} x_i p(x_i)
The variance of a DRV X is defined as σ² = Var(X) = E[X − E(X)]² = ∑_{i=1}^{n} (x_i − μ)² p(x_i) = E(X²) − μ²
The mean or expected value of a CRV X is defined as μ = E(X) = ∫_{−∞}^{∞} x f(x) dx
The variance of a CRV X is defined as Var(X) = σ² = ∫_{−∞}^{∞} x² f(x) dx − μ²
Properties:
𝐸(𝑎) = 𝑎
𝐸(𝑎𝑋 + 𝑏) = 𝐸(𝑎𝑋) + 𝐸(𝑏) = 𝑎𝐸(𝑋) + 𝑏
𝐸(𝑋 + 𝑌) = 𝐸(𝑋) + 𝐸(𝑌)
𝑉(𝑎) = 0
𝑉(𝑎𝑋) = 𝑎2 𝑉(𝑋)
𝑉(𝑎𝑋 + 𝑏) = 𝑎2 𝑉(𝑋)
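These properties are easy to illustrate numerically; the sketch below checks E(aX + b) = aE(X) + b and V(aX + b) = a²V(X) for a = 2, b = 4, using the distribution of Problem 1 below.

```python
# Numerical illustration of the expectation/variance properties, using the
# distribution of Problem 1 below.
xs = [-2, 0, 1, 2]
ps = [0.2, 0.4, 0.3, 0.1]

def E(g):                        # expectation of g(X)
    return sum(g(x) * p for x, p in zip(xs, ps))

EX = E(lambda x: x)              # 0.1
VX = E(lambda x: x**2) - EX**2   # 1.49
a, b = 2, 4
print(E(lambda x: a*x + b), a*EX + b)                       # both 4.2 (up to rounding)
print(E(lambda x: (a*x + b)**2) - E(lambda x: a*x + b)**2,
      a**2 * VX)                                            # both 5.96 (up to rounding)
```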
1. For the following distribution find E(X), E(2X + 4), E(X² + 2), V(X), SD(X) and V(2X).
x -2 0 1 2
𝑝(𝑥) 0.2 0.4 0.3 0.1
Solution:
E(X) = ∑x p(x) = (−2 × 0.2) + (0 × 0.4) + (1 × 0.3) + (2 × 0.1) = 0.1
E(X²) = ∑x² p(x) = ((−2)² × 0.2) + (0² × 0.4) + (1² × 0.3) + (2² × 0.1) = 1.5
E(2X + 4) = 2E(X) + 4 = 4.2
E(X² + 2) = E(X²) + 2 = 3.5
V(X) = E(X²) − [E(X)]² = 1.5 − 0.01 = 1.49
SD(X) = √V(X) = 1.2207
V(2X) = 4V(X) = 5.96
2. Find the missing probability A in the following distribution, then compute E(X) and V(X).
x        2      4      6      8      10
p(x)     1/8    1/6    A      1/4    1/12
Solution: Since ∑p(x) = 1, A = 1 − (1/8 + 1/6 + 1/4 + 1/12) = 1 − 5/8 = 3/8.
E(X) = (2 × 1/8) + (4 × 1/6) + (6 × 3/8) + (8 × 1/4) + (10 × 1/12) = 6
E(X²) = (4 × 1/8) + (16 × 1/6) + (36 × 3/8) + (64 × 1/4) + (100 × 1/12) = 41
V(X) = E(X²) − [E(X)]² = 41 − 36 = 5
3. A person tosses a fair coin. He gets Rs 100 if a head appears and loses Rs 50 if a tail appears. Find his expected gain and SD(X).
Solution: Let X be the amount he gets.
The probability distribution is given by
x 100 -50
p(x) 1/2 1/2
𝐸(𝑋) = ∑𝑥𝑝(𝑥) = 50 − 25 = 25
E(X²) = (100² × 1/2) + ((−50)² × 1/2) = 6250
V(X) = E(X²) − [E(X)]² = 6250 − 25² = 5625
𝑆𝐷(𝑋) = √5625 = 75
4. A bag contains 5 red and 3 blue marbles. A person draws 2 marbles at random and receives Rs 5 for every red marble and Rs 8 for every blue marble he draws. What is his expectation?
Solution: Let X be the amount he gets.
Given S = {5R, 3B}. He receives Rs 10 if both marbles are red, Rs 13 if one is red and one is blue, and Rs 16 if both are blue.
P[X = 10] = P[both red] = 5C2 / 8C2 = 10/28
P[X = 13] = P[one red and one blue] = (5C1 × 3C1) / 8C2 = 15/28
P[X = 16] = P[both blue] = 3C2 / 8C2 = 3/28
E(X) = (10 × 10/28) + (13 × 15/28) + (16 × 3/28) = 343/28 = Rs 12.25
5. A lot of 12 items contains 4 defective items. Three items are drawn at random from the lot. Find the expected number of defective items drawn.
Solution: Let X be the number of defective items among the three drawn.
P[X = 0] = P[no defective] = 8C3 / 12C3 = 14/55
P[X = 1] = P[1 defective] = (4C1 × 8C2) / 12C3 = 28/55
P[X = 2] = P[2 defective] = (4C2 × 8C1) / 12C3 = 12/55
P[X = 3] = P[3 defective] = 4C3 / 12C3 = 1/55
x        0        1        2        3
p(x)     14/55    28/55    12/55    1/55
E(X) = (0 × 14/55) + (1 × 28/55) + (2 × 12/55) + (3 × 1/55) = 55/55 = 1
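Assuming the setup above (3 items drawn from a lot of 12 containing 4 defectives), the value E(X) = 1 can also be confirmed by enumerating all 220 equally likely draws, as in this sketch.

```python
# Enumeration check for Problem 5: E(X) for X = number of defectives drawn.
from itertools import combinations
from fractions import Fraction

items = ["D"]*4 + ["G"]*8                  # 4 defective, 8 good
draws = list(combinations(range(12), 3))   # 220 equally likely draws

total = sum(sum(items[i] == "D" for i in d) for d in draws)
EX = Fraction(total, len(draws))           # 220/220 = 1
print(EX)
```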
6. A man wins if he gets 5 on a single throw of a die. He loses if gets 2 or 4. If he wins, he gets Rs 50.
If he loses, he gets Rs 10, otherwise he has to pay Rs 15. Find his expected gain.
Solution: let 𝑆 = {1,2,3,4,5,6}
Let X be the amount he gets
P[X = 50] = P[he gets 5] = 1/6
P[X = 10] = P[he gets 2 or 4] = 1/6 + 1/6 = 2/6
P[X = −15] = P[he gets 1, 3 or 6] = 1/6 + 1/6 + 1/6 = 3/6
The probability distribution is given by
x 50 10 -15
p(x) 1/6 2/6 3/6
E(X) = (50 × 1/6) + (10 × 2/6) + (−15 × 3/6) = 25/6 = 4.1667
7. The probability distribution of a random variable X is given by the following table.
x                 0    1     2     3     4     5      6
P(X = x) = p(x)   k    3k    5k    7k    9k    11k    13k
Find (i) k (ii) 𝑃(𝑋 ≥ 5); (iii) 𝑃(2 ≤ 𝑋 < 5); (iv) 𝑃(𝑋 < 5) and (v) E(X) (vi) Var(X)
Solution: To find the value of k, consider the sum of all the probabilities, which equals 49k. Equating this to 1 gives k = 1/49, so the distribution is
x                 0       1       2       3       4       5        6
P(X = x) = p(x)   1/49    3/49    5/49    7/49    9/49    11/49    13/49
P[X ≥ 5] = P[X = 5] + P[X = 6] = 11/49 + 13/49 = 24/49.
P[2 ≤ X < 5] = P[X = 2] + P[X = 3] + P[X = 4] = 5/49 + 7/49 + 9/49 = 21/49.
P(X < 5) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4) = 25/49
OR
P(X < 5) = 1 − P(X ≥ 5) = 1 − 24/49 = 25/49
E(X) = ∑x p(x) = (0 + 3 + 10 + 21 + 36 + 55 + 78)/49 = 203/49 = 4.1429
E(X²) = ∑x² p(x) = (0 + 3 + 20 + 63 + 144 + 275 + 468)/49 = 973/49
Var(X) = E(X²) − [E(X)]² = 973/49 − (203/49)² = 132/49 = 2.6939
8. The probability distribution of a random variable X is given by the following table.
x                 0    1    2     3     4     5     6      7
P(X = x) = p(x)   0    k    2k    2k    3k    k²    2k²    7k² + k
Find (i) k; (ii) 𝑃(𝑋 < 6) (iii) 𝑃(𝑋 ≥ 6); (iv) 𝑃(0 < 𝑋 < 5); (v) E(X)
Solution: ∑p(x) = 1
10k² + 9k = 1, i.e., (10k − 1)(k + 1) = 0, so k = 0.1 or k = −1
Since probabilities cannot be negative, k = 0.1
(ii) P(X < 6) = 0 + k + 2k + 2k + 3k + k² = 8k + k² = 0.81
(iii) P(X ≥ 6) = 2k² + (7k² + k) = 9k² + k = 0.19
OR P(X ≥ 6) = 1 − P(X < 6) = 1 − 0.81 = 0.19
(iv) P(0 < X < 5) = k + 2k + 2k + 3k = 8k = 0.8
(v) E(X) = ∑x p(x) = k + 2(2k) + 3(2k) + 4(3k) + 5k² + 6(2k²) + 7(7k² + k) = 30k + 66k² = 3 + 0.66 = 3.66
9. A random variable X has p(x) = 2^{−x}, x = 1, 2, 3, … Show that p(x) is a probability function. Also find
(i) P(X is even) (ii) P(X is divisible by 3) (iii) P(X ≥ 5).
Solution: p(x) = 2^{−x} = 1/2^x ≥ 0 for all x.
∑p(x) = ∑_{x=1}^{∞} 2^{−x} = 1/2 + 1/2² + 1/2³ + ⋯ = a/(1 − r) = (1/2)/(1 − 1/2) = 1
Hence p(x) is a probability function.
(i) P(X is even) = ∑_{x=2,4,6,…} 2^{−x} = 1/2² + 1/2⁴ + 1/2⁶ + ⋯ = (1/2²)/(1 − 1/2²) = 1/3
(ii) P(X is divisible by 3) = ∑_{x=3,6,9,…} 2^{−x} = 1/2³ + 1/2⁶ + 1/2⁹ + ⋯ = (1/2³)/(1 − 1/2³) = 1/7
(iii) P(X ≥ 5) = ∑_{x=5,6,7,…} 2^{−x} = 1/2⁵ + 1/2⁶ + 1/2⁷ + ⋯ = (1/2⁵)/(1 − 1/2) = 1/16
OR
P(X ≥ 5) = 1 − P(X < 5) = 1 − (1/2 + 1/2² + 1/2³ + 1/2⁴) = 1 − 15/16 = 1/16
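The three geometric-series results can be checked numerically by truncating the infinite sums at a large cutoff, as in the sketch below.

```python
# Numerical check of the geometric-series results for p(x) = 2^(-x).
N = 200   # truncation point; 2^(-x) is negligible beyond this

total      = sum(2**-x for x in range(1, N))       # ~1
p_even     = sum(2**-x for x in range(2, N, 2))    # ~1/3
p_div_by_3 = sum(2**-x for x in range(3, N, 3))    # ~1/7
p_ge_5     = sum(2**-x for x in range(5, N))       # ~1/16
print(total, p_even, p_div_by_3, p_ge_5)
```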
Problems on continuous random variables
1. The diameter of an electric cable, say X, is assumed to be a continuous random variable with p.d.f.
f(x) = 6x(1 − x),  0 ≤ x ≤ 1
     = 0,           otherwise
(i) Check that the above is a p.d.f.
(ii) Find P(2/3 < x < 1)
(iii) Determine a number b such that 𝑃(𝑋 < 𝑏) = 𝑃(𝑋 > 𝑏).
Solution:
(i) ∫_{−∞}^{∞} f(x) dx = ∫_{−∞}^{0} f(x) dx + ∫_{0}^{1} f(x) dx + ∫_{1}^{∞} f(x) dx
= 0 + ∫_{0}^{1} 6x(1 − x) dx + 0
= [6x²/2 − 6x³/3]_{x=0}^{1} = 3 − 2
= 1
Hence f(x) is a p.d.f.
(ii) P(2/3 < x < 1) = ∫_{2/3}^{1} f(x) dx = ∫_{2/3}^{1} (6x − 6x²) dx = [3x² − 2x³]_{2/3}^{1}
= 1 − (3 × 4/9 − 2 × 8/27) = 1 − (4/3 − 16/27) = 7/27
(iii) The condition P(X < b) = P(X > b) gives
∫_{−∞}^{b} f(x) dx = ∫_{b}^{∞} f(x) dx
∫_{0}^{b} f(x) dx = ∫_{b}^{1} f(x) dx
6 ∫_{0}^{b} x(1 − x) dx = 6 ∫_{b}^{1} x(1 − x) dx
(b²/2 − b³/3) = (1/2 − 1/3) − (b²/2 − b³/3)
b² − 2b³/3 = 1/6
4b³ − 6b² + 1 = 0
b = 0.5, 1.366, −0.366
From this 𝑏 = 0.5 is the only real value lying between 0 and 1 and satisfying the given condition.
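The cubic 4b³ − 6b² + 1 = 0 can also be solved numerically; the sketch below uses numpy.roots and then picks the admissible root in (0, 1).

```python
# Numerical solution of 4b^3 - 6b^2 + 1 = 0 from part (iii); only the root
# in (0, 1) is admissible for the condition P(X < b) = P(X > b).
import numpy as np

roots = np.roots([4, -6, 0, 1])   # coefficients of 4b^3 - 6b^2 + 0b + 1
print(roots)                      # approximately 1.366, -0.366, 0.5
b = [r.real for r in roots if abs(r.imag) < 1e-9 and 0 < r.real < 1][0]
print(b)                          # 0.5
```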
2. Suppose that the error in the reaction temperature, in °C, for a controlled laboratory experiment is a continuous random variable X having the probability density function
f(x) = x²/3,  −1 < x < 2
     = 0,     elsewhere
(a) Verify that f(x) is a density function. (b) Find P(0 < X ≤ 1).
Solution:
(a) ∫_{−∞}^{∞} f(x) dx = ∫_{−1}^{2} (x²/3) dx = [x³/9]_{−1}^{2} = 8/9 + 1/9 = 1. Hence the given function is a p.d.f.
(b) P(0 < X ≤ 1) = ∫_{0}^{1} (x²/3) dx = 1/9.
3. The length of time (in minutes) that a certain lady speaks on the telephone is found to be a random variable with probability density function
f(x) = A e^{−x/5},  x ≥ 0
     = 0,           otherwise
(i) Find A.
(ii) Find the probability that she will speak on the phone
(a) more than 10 minutes (b) less than 5 minutes (c) between 5 and 10 minutes.
Solution: (i) Since f(x) is a p.d.f., ∫_{−∞}^{∞} f(x) dx = 1
∫_{−∞}^{0} f(x) dx + ∫_{0}^{∞} f(x) dx = 1
0 + ∫_{0}^{∞} A e^{−x/5} dx = 1
A [e^{−x/5} / (−1/5)]_{0}^{∞} = 1
−5A(0 − 1) = 1
5A = 1
A = 1/5
(ii) (a) P(x > 10) = ∫_{10}^{∞} f(x) dx = ∫_{10}^{∞} (1/5) e^{−x/5} dx = [−e^{−x/5}]_{10}^{∞} = e^{−2} = 0.1353
(b) P(x < 5) = ∫_{−∞}^{5} f(x) dx = ∫_{0}^{5} (1/5) e^{−x/5} dx = [−e^{−x/5}]_{0}^{5} = −(e^{−1} − 1) = 1 − e^{−1} = 0.6321
(c) P(5 < x < 10) = ∫_{5}^{10} f(x) dx = ∫_{5}^{10} (1/5) e^{−x/5} dx = −e^{−2} + e^{−1} = 0.2325
4. Suppose X is a continuous random variable with the probability density function f(x) = 3x² for 0 < x < 1 (and 0 elsewhere). Find the mean and variance of X.
Solution: Mean = E(X) = μ = ∫_{−∞}^{∞} x f(x) dx
= ∫_{−∞}^{0} x f(x) dx + ∫_{0}^{1} x f(x) dx + ∫_{1}^{∞} x f(x) dx
= 0 + ∫_{0}^{1} (x × 3x²) dx + 0 = ∫_{0}^{1} 3x³ dx = 3/4.
Variance = σ² = ∫_{−∞}^{∞} x² f(x) dx − [E(X)]²
= ∫_{0}^{1} x² f(x) dx − [E(X)]²
= ∫_{0}^{1} (x² × 3x²) dx − (3/4)²
= ∫_{0}^{1} 3x⁴ dx − (3/4)² = 3/5 − 0.5625 = 3/80
= 0.0375.
5. X is a random variable with probability density function given by
f(x) = kx,          0 ≤ x < 2
     = 2k,          2 ≤ x < 4
     = −kx + 6k,    4 ≤ x < 6
     = 0,           elsewhere
Find the value of k and the mean E(X).
Solution: ∫_{−∞}^{∞} f(x) dx = 1
∫_{0}^{6} f(x) dx = 1
∫_{0}^{2} f(x) dx + ∫_{2}^{4} f(x) dx + ∫_{4}^{6} f(x) dx = 1
∫_{0}^{2} kx dx + ∫_{2}^{4} 2k dx + ∫_{4}^{6} (−kx + 6k) dx = 1
2k + 4k + 2k = 1, so 8k = 1 and k = 1/8
E(X) = ∫_{0}^{2} x f(x) dx + ∫_{2}^{4} x f(x) dx + ∫_{4}^{6} x f(x) dx
E(X) = ∫_{0}^{2} (x²/8) dx + ∫_{2}^{4} (x/4) dx + ∫_{4}^{6} x(6 − x)/8 dx
E(X) = 1/3 + 3/2 + 7/6 = 3
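Both the normalization constant and the mean can be checked symbolically with sympy's Piecewise, as in the sketch below.

```python
# Symbolic check for Problem 5: solve for k from the normalization condition
# and then compute E(X) for the piecewise density.
import sympy as sp

x, k = sp.symbols("x k", positive=True)
f = sp.Piecewise((k*x, x < 2), (2*k, x < 4), (-k*x + 6*k, x < 6), (0, True))

k_val = sp.solve(sp.Eq(sp.integrate(f, (x, 0, 6)), 1), k)[0]   # 1/8
EX = sp.integrate(x * f.subs(k, k_val), (x, 0, 6))             # 3
print(k_val, EX)
```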
6. Suppose that a continuous random variable X has the probability density function
f(x) = 0.25 e^{−0.25x},  x > 0
     = 0,                elsewhere
Find the value of c such that P(X < c) = 0.05.
Solution:
P[X < c] = ∫_{−∞}^{c} f(x) dx = ∫_{−∞}^{0} f(x) dx + ∫_{0}^{c} f(x) dx = ∫_{0}^{c} 0.25 e^{−0.25x} dx
P[X < c] = −(e^{−0.25c} − 1)
0.05 = 1 − e^{−0.25c}
e^{−0.25c} = 0.95
−0.25c = ln(0.95) = −0.05129
c = 0.2051
f(x) = 2x,       0 < x ≤ 1
     = 4 − 2x,   1 < x < 2
     = 0,        elsewhere
Solution: For 𝑓(𝑥) to be a probability density function, the following two conditions are to be
satisfied.
(i) f(x) ≥ 0 and (ii) ∫_{−∞}^{∞} f(x) dx = 1