UNIT 8   MATHEMATICAL EXPECTATION
Structure
8.1 Introduction
Objectives
8.1 INTRODUCTION
In Units 1 to 4 of this course, you have studied probabilities of different events
in various situations. The concept of a univariate random variable was
introduced in Unit 5, whereas that of a bivariate random variable in Units 6 and
7. Before studying the present unit, we advise you to go through the above
units.
You have studied the methods of finding the mean, variance and other measures
in the context of frequency distributions in MST-002 (Descriptive Statistics).
Here, in this unit, we will discuss the mean, variance and other measures in the
context of probability distributions of random variables. The mean or average
value of a random variable, taken over all its possible values, is called the
expected value or the expectation of the random variable. In the present unit, we
discuss the expectations of random variables and their properties.
In Secs. 8.2, 8.3 and 8.4, we deal with expectation and its properties. Addition
and multiplication laws of expectation have been discussed in Sec. 8.5.
Objectives
After studying this unit, you should be able to:
find the expected values of random variables;
establish the properties of expectation;
obtain various measures for probability distributions; and
apply the laws of addition and multiplication of expectation in appropriate
situations.
Recall that if a variable X takes the values $x_1, x_2, \dots, x_n$ with respective frequencies $f_1, f_2, \dots, f_n$, then

Mean $= \frac{\sum_{i=1}^{n} f_i x_i}{\sum_{i=1}^{n} f_i}$

$= \frac{f_1 x_1 + f_2 x_2 + \dots + f_n x_n}{\sum_{i=1}^{n} f_i}$

$= \frac{x_1 f_1}{\sum_{i=1}^{n} f_i} + \frac{x_2 f_2}{\sum_{i=1}^{n} f_i} + \dots + \frac{x_n f_n}{\sum_{i=1}^{n} f_i}$

$= x_1 \frac{f_1}{\sum_{i=1}^{n} f_i} + x_2 \frac{f_2}{\sum_{i=1}^{n} f_i} + \dots + x_n \frac{f_n}{\sum_{i=1}^{n} f_i}$

Notice that $\frac{f_1}{\sum_{i=1}^{n} f_i}, \frac{f_2}{\sum_{i=1}^{n} f_i}, \dots, \frac{f_n}{\sum_{i=1}^{n} f_i}$ are, in fact, the relative frequencies or the probabilities with which X takes the values $x_1, x_2, \dots, x_n$ respectively.
Mean of a frequency distribution of X is $\frac{\sum_{i=1}^{n} x_i f_i}{\sum_{i=1}^{n} f_i}$; similarly, the mean of a probability distribution of a r.v. X is $\frac{\sum_{i=1}^{n} x_i p_i}{\sum_{i=1}^{n} p_i}$.

Now, as we know that $\sum_{i=1}^{n} p_i = 1$ for a probability distribution, therefore the mean of the probability distribution becomes $\sum_{i=1}^{n} x_i p_i$.
Expected value of a random variable X is $E(X) = \sum_{i=1}^{n} x_i p_i$.

The above formula for finding the expected value of a random variable X is used only if X is a discrete random variable which takes the values $x_1, x_2, \dots, x_n$ with probability mass function $p(x_i) = P[X = x_i]$, $i = 1, 2, \dots, n$.
For instance, if X is the number of heads obtained when a fair coin is tossed twice, then X takes the values 0, 1, 2 with probabilities 1/4, 1/2, 1/4, and

$E(X) = x_1 p_1 + x_2 p_2 + x_3 p_3 = 0\left(\frac{1}{4}\right) + 1\left(\frac{1}{2}\right) + 2\left(\frac{1}{4}\right) = 0 + \frac{1}{2} + \frac{1}{2} = 1.$

So, we get the same answer, i.e. 1, using the formula also.
So, the expectation of a random variable is nothing but the average (mean)
taken over all the possible values of the random variable; equivalently, it is the
value which we get on an average when the random experiment is performed
repeatedly.
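The defining sum $E(X) = \sum_i x_i p_i$ is straightforward to compute mechanically. The following is a minimal Python sketch (the function name `expectation` and the example pmf are our own illustration, not part of this unit):

```python
def expectation(pmf):
    """Expected value of a discrete random variable, pmf given as {value: probability}."""
    assert abs(sum(pmf.values()) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(x * p for x, p in pmf.items())

# X = number of heads in two tosses of a fair coin (the example above):
print(expectation({0: 0.25, 1: 0.5, 2: 0.25}))  # 1.0
```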
Remark 1: Sometimes the summations and integrals considered in the above
definitions may not be convergent, and hence the expectations in such cases do
not exist. But we will deal only with those summations (series) and integrals
which are convergent, as the topic of checking the convergence of series or
integrals is beyond the scope of this course. You need not bother about whether
the series or integral is convergent, i.e. whether the expectation exists, as we
are dealing only with those expectations which exist.
Example 1: If it rains, a raincoat dealer can earn Rs 500 per day. If it is a dry day, he can lose Rs 100 per day. What is his expectation, if the probability of rain is 0.4?

Solution: Let X be the amount (in Rs) earned on a day by the dealer. Therefore, X can take the values 500 and $-100$ (a loss of Rs 100 is equivalent to a negative earning of Rs 100).

Probability distribution of X is given as

            Rainy day    Dry day
X (in Rs):    500         -100
p(x):         0.4          0.6

$\therefore E(X) = 500(0.4) + (-100)(0.6) = 200 - 60 = 140.$

Thus, his expectation is Rs 140, i.e. on an average he earns Rs 140 per day.
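As a quick numerical check of the arithmetic in Example 1 (again a sketch of our own, not part of the unit):

```python
# Expected daily earning: Rs 500 with probability 0.4, a Rs 100 loss with probability 0.6.
values, probs = [500, -100], [0.4, 0.6]
print(sum(v * p for v, p in zip(values, probs)))  # 140.0
```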
Example 2: Two fair coins are tossed. A person wins Rs 5 if two heads appear, Rs 2 if one head appears and Rs 1 if no head appears. Find the expected amount won by him.

Solution: Here,

$P[\text{2 heads}] = \frac{1}{4}, \quad P[\text{one head}] = \frac{2}{4}, \quad P[\text{no head}] = \frac{1}{4}.$

Let X be the amount in rupees won by him. X can take the values 5, 2 and 1 with

$P[X = 5] = P[\text{2 heads}] = \frac{1}{4},$

$P[X = 2] = P[\text{1 head}] = \frac{2}{4},$ and

$P[X = 1] = P[\text{no head}] = \frac{1}{4}.$

Probability distribution of X is

X:      5    2    1
p(x):  1/4  2/4  1/4

Expected value of X is given as

$E(X) = \sum_{i=1}^{3} x_i p_i = x_1 p_1 + x_2 p_2 + x_3 p_3 = 5\left(\frac{1}{4}\right) + 2\left(\frac{2}{4}\right) + 1\left(\frac{1}{4}\right) = \frac{5 + 4 + 1}{4} = \frac{10}{4} = 2.5.$

Thus, the expected value of the amount won by him is Rs 2.5.
Example 3: Find the expectation of the number on an unbiased die when thrown.

Solution: Let X be a random variable representing the number on a die when thrown. X can take the values 1, 2, 3, 4, 5, 6 with

$P[X = 1] = P[X = 2] = P[X = 3] = P[X = 4] = P[X = 5] = P[X = 6] = \frac{1}{6}.$

Thus, the probability distribution of X is given by

X:     1    2    3    4    5    6
p(x): 1/6  1/6  1/6  1/6  1/6  1/6

Hence, the expectation of the number on the die when thrown is

$E(X) = \sum_{i=1}^{6} x_i p_i = 1\left(\frac{1}{6}\right) + 2\left(\frac{1}{6}\right) + 3\left(\frac{1}{6}\right) + 4\left(\frac{1}{6}\right) + 5\left(\frac{1}{6}\right) + 6\left(\frac{1}{6}\right) = \frac{21}{6} = \frac{7}{2}.$
Example 4: Two cards are drawn successively with replacement from a well shuffled pack of 52 cards. Find the expected value of the number of aces.

Solution: Let $A_1$, $A_2$ be the events of getting an ace in the first and second draws, respectively. As the draws are with replacement, $A_1$ and $A_2$ are independent. Let X be the number of aces drawn. Thus, X can take the values 0, 1, 2 with

$P[X = 0] = P[\text{no ace}] = P[\bar{A}_1 \cap \bar{A}_2] = P[\bar{A}_1]\,P[\bar{A}_2] = \frac{48}{52} \times \frac{48}{52} = \frac{12}{13} \times \frac{12}{13} = \frac{144}{169},$

$P[X = 1] = P[\text{one ace and one other card}] = P[(A_1 \cap \bar{A}_2) \cup (\bar{A}_1 \cap A_2)] = \frac{4}{52} \times \frac{48}{52} + \frac{48}{52} \times \frac{4}{52} = \frac{1}{13} \times \frac{12}{13} + \frac{12}{13} \times \frac{1}{13} = \frac{24}{169},$ and

$P[X = 2] = P[A_1 \cap A_2] = P[A_1]\,P[A_2] = \frac{4}{52} \times \frac{4}{52} = \frac{1}{169}.$

Hence, the expected value of the number of aces is

$E(X) = 0\left(\frac{144}{169}\right) + 1\left(\frac{24}{169}\right) + 2\left(\frac{1}{169}\right) = \frac{26}{169} = \frac{2}{13}.$
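The value 2/13 ≈ 0.1538 can also be corroborated by simulation. A minimal sketch of our own, assuming Python's standard random module:

```python
import random

# Two draws with replacement from a 52-card deck containing 4 aces;
# count the aces in each trial and average over many trials.
trials = 200_000
total = sum(sum(random.randrange(52) < 4 for _ in range(2)) for _ in range(trials))
print(total / trials)  # close to 2/13 = 0.1538...
```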
Now, you can try the following exercises.
E1) You toss a fair coin. If the outcome is head, you win Rs 100; if the
outcome is tail, you win nothing. What is the expected amount won
by you?
E2) A fair coin is tossed until a tail appears. What is the expectation of the
number of tosses?
E3) The distribution of a continuous random variable X is defined by

$f(x) = \begin{cases} x^3, & 0 \le x \le 1 \\ (2 - x)^3, & 1 < x \le 2 \\ 0, & \text{elsewhere} \end{cases}$

Obtain the expected value of X.
Discrete Case:

Let X be a discrete random variable with probability mass function $p_i = P[X = x_i]$. Then, for constants k, a and b:

1. $E(k) = \sum_i k\,p_i$ [By def.]

$= k \sum_i p_i = k(1) = k$

2. $E(kX) = \sum_i k\,x_i p_i$ [By def.]

$= k \sum_i x_i p_i = k\,E(X)$

3. $E(aX + b) = \sum_i (a x_i + b)\,p_i$ [By def.]

$= \sum_i (a x_i p_i + b p_i) = a \sum_i x_i p_i + b \sum_i p_i = a\,E(X) + b(1) = a\,E(X) + b$
Continuous Case:

Let X be a continuous random variable having f(x) as its probability density function. Thus,

1. $E(k) = \int k\,f(x)\,dx$ [By def.]

$= k \int f(x)\,dx = k(1) = k$

2. $E(kX) = \int k\,x f(x)\,dx$ [By def.]

$= k \int x f(x)\,dx = k\,E(X)$

3. $E(aX + b) = \int (ax + b)\,f(x)\,dx = \int ax\,f(x)\,dx + \int b\,f(x)\,dx$

$= a \int x f(x)\,dx + b \int f(x)\,dx = a\,E(X) + b(1) = a\,E(X) + b$
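Property 3 is easy to verify numerically for any particular density. A minimal sketch (the density f(x) = 2x on [0, 1] and the constants a, b are our own choices for illustration):

```python
# Midpoint-rule check of E(aX + b) = a E(X) + b for the density f(x) = 2x on [0, 1].
a, b, n = 4.0, -5.0, 100_000
f = lambda x: 2 * x
xs = [(i + 0.5) / n for i in range(n)]           # midpoints of [0, 1]
E_X = sum(x * f(x) for x in xs) / n              # ~2/3
E_aXb = sum((a * x + b) * f(x) for x in xs) / n
print(E_aXb, a * E_X + b)                        # ~ -2.3333 in both cases
```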
Example 6: Find
i) E(X)
ii) E(2X + 3)
iii) E(X²)
iv) E(4X − 5)

Solution:

i) $E(X) = \sum_{i=1}^{5} x_i p_i = x_1 p_1 + x_2 p_2 + x_3 p_3 + x_4 p_4 + x_5 p_5$
Let us now express the moments and other measures for a random variable
in terms of expectations in the following section.
So, the rth order moment about any point A of a random variable X having probability mass function $P[X = x_i] = p(x_i) = p_i$ is defined as

$\mu_r' = \frac{\sum_{i=1}^{n} p_i (x_i - A)^r}{\sum_{i=1}^{n} p_i} = \sum_{i=1}^{n} p_i (x_i - A)^r \qquad \left[\because \sum_{i=1}^{n} p_i = 1\right]$
Variance

Variance of a random variable X is the second order central moment and is defined as

$\mu_2 = V(X) = E[X - E(X)]^2 = E(X^2) - [E(X)]^2$

Also, we know that

$V(X) = \mu_2' - (\mu_1')^2.$
Theorem 8.1: If a and b are constants, then $V(aX + b) = a^2\,V(X)$.

Proof: $V(aX + b) = E[(aX + b) - E(aX + b)]^2$

$= E[aX + b - a\,E(X) - b]^2$ [Using Property 3 of Sec. 8.3]

$= E[a(X - E(X))]^2$

$= a^2\,E[X - E(X)]^2$ [Using Property 2 of Sec. 8.3]

$= a^2\,V(X)$

Cor. (i) $V(aX) = a^2\,V(X)$ [Taking b = 0]

(ii) $V(b) = 0$ [Taking a = 0]

(iii) $V(X + b) = V(X)$ [Taking a = 1]
Covariance

For a bivariate frequency distribution,

$\text{Cov}(X, Y) = \frac{\sum_i f_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i f_i}$

Analogously, for a bivariate probability distribution, the covariance of X and Y is

$\text{Cov}(X, Y) = E[(X - E(X))(Y - E(Y))] = \sum_i \sum_j p_{ij}\,(x_i - E(X))(y_j - E(Y))$

where $p_{ij} = P[X = x_i, Y = y_j]$.
Theorem: If X and Y are random variables, then

$V(X + Y) = V(X) + V(Y) + 2\,\text{Cov}(X, Y)$

Proof: $V(X + Y) = E[(X + Y) - E(X + Y)]^2$

$= E[(X + Y) - E(X) - E(Y)]^2$

$= E[(X - E(X)) + (Y - E(Y))]^2$

$= E[(X - E(X))^2 + (Y - E(Y))^2 + 2(X - E(X))(Y - E(Y))]$

$= E[X - E(X)]^2 + E[Y - E(Y)]^2 + 2\,E[(X - E(X))(Y - E(Y))]$

$= V(X) + V(Y) + 2\,\text{Cov}(X, Y)$

If X and Y are independent random variables, then Cov(X, Y) = 0 and hence

$V(X + Y) = V(X) + V(Y).$

Remark 3: If X and Y are independent random variables and a, b are constants, then

$V(aX + bY) = a^2\,V(X) + b^2\,V(Y).$
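These identities can be checked mechanically on any small joint distribution. A minimal sketch (the joint pmf below is a toy example of our own, with X and Y deliberately dependent):

```python
# Toy joint pmf p[(x, y)]; check V(X + Y) = V(X) + V(Y) + 2 Cov(X, Y).
p = {(0, 0): 0.3, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.4}

E = lambda g: sum(g(x, y) * q for (x, y), q in p.items())
EX, EY = E(lambda x, y: x), E(lambda x, y: y)
VX = E(lambda x, y: (x - EX) ** 2)
VY = E(lambda x, y: (y - EY) ** 2)
cov = E(lambda x, y: (x - EX) * (y - EY))
V_sum = E(lambda x, y: ((x + y) - (EX + EY)) ** 2)
print(V_sum, VX + VY + 2 * cov)  # the two values agree
```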
For a frequency distribution, mean deviation about mean is $\frac{\sum_{i=1}^{n} f_i\,|x_i - \bar{x}|}{\sum_{i=1}^{n} f_i}$, and similarly, for a probability distribution of a r.v. X, it is

$\frac{\sum_{i=1}^{n} p_i\,|x_i - \text{Mean}|}{\sum_{i=1}^{n} p_i} = \sum_{i=1}^{n} p_i\,|x_i - \text{Mean}| \qquad \left[\because \sum_{i=1}^{n} p_i = 1\right]$

Hence, mean deviation about mean

$= \begin{cases} \sum_i p_i\,|x_i - \text{Mean}| & \text{for discrete r.v.} \\ \int |x - \text{Mean}|\,f(x)\,dx & \text{for continuous r.v.} \end{cases}$
Note: Other measures, as defined for frequency distributions in MST-002, can
be defined for probability distributions also and hence can be expressed in
terms of expectations in the same manner as the moments, variance and
covariance have been defined in this section of the unit.
Example 7: Considering the probability distribution given in Example 6, obtain
i) V(X)
ii) V(2X + 3).

Solution:

i) $V(X) = E(X^2) - [E(X)]^2$
Example: If X and Y are independent random variables with V(X) = 2 and V(Y) = 3, find V(3X + 4Y).

Solution: V(3X + 4Y) = (3)²V(X) + (4)²V(Y) [By Remark 3 of Section 8.4]
= 9(2) + 16(3) = 18 + 48 = 66
8.5 ADDITION AND MULTIPLICATION LAWS OF EXPECTATION
Addition Theorem of Expectation
Theorem 8.2: If X and Y are random variables, then E(X + Y) = E(X) + E(Y).
Proof:
Discrete case:
Let (X, Y) be a discrete two-dimensional random variable which takes up the values $(x_i, y_j)$ with the joint probability mass function $p_{ij} = P[X = x_i \cap Y = y_j]$. Let

$p_i = p(x_i) = P[X = x_i] = \sum_j p_{ij}$ and $p_j' = p(y_j) = P[Y = y_j] = \sum_i p_{ij}$

be the marginal probability mass functions of X and Y respectively. Then

$E(X) = \sum_i x_i p_i, \quad E(Y) = \sum_j y_j p_j', \quad \text{and} \quad E(X + Y) = \sum_i \sum_j (x_i + y_j)\,p_{ij}.$
Now $E(X + Y) = \sum_i \sum_j (x_i + y_j)\,p_{ij}$

$= \sum_i \sum_j x_i\,p_{ij} + \sum_i \sum_j y_j\,p_{ij}$

$= \sum_i x_i \left(\sum_j p_{ij}\right) + \sum_j y_j \left(\sum_i p_{ij}\right)$

[∵ in the first term of the right hand side, $x_i$ is free from j and hence can be taken outside the summation over j; and in the second term of the right hand side, $y_j$ is free from i and hence can be taken outside the summation over i.]

$\therefore E(X + Y) = \sum_i x_i p_i + \sum_j y_j p_j' = E(X) + E(Y)$
Continuous Case:
Let (X, Y) be a bivariate continuous random variable with joint probability
density function f(x, y). Let f(x) and f(y) be the marginal probability density
functions of random variables X and Y respectively.
Then

$E(X) = \int x f(x)\,dx, \quad E(Y) = \int y f(y)\,dy,$

and $E(X + Y) = \iint (x + y)\,f(x, y)\,dy\,dx.$

Now, $E(X + Y) = \iint (x + y)\,f(x, y)\,dy\,dx$

$= \iint x\,f(x, y)\,dy\,dx + \iint y\,f(x, y)\,dy\,dx$

$= \int x \left(\int f(x, y)\,dy\right) dx + \int y \left(\int f(x, y)\,dx\right) dy$
[∵ in the first term of R.H.S., x is free from the integral w.r.t. y and hence can
be taken outside this integral; similarly, in the second term of R.H.S., y is free
from the integral w.r.t. x and hence can be taken outside this integral.]
$= \int x f(x)\,dx + \int y f(y)\,dy$ [Refer to the definition of marginal density function given in Unit 7 of this course]

$= E(X) + E(Y)$
Remark 3: The result can be similarly extended for more than two random
variables.
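Note that the theorem requires no independence assumption, which is easy to confirm numerically. A minimal sketch (toy joint pmf of our own choosing, with X and Y dependent):

```python
# Dependent pair (X, Y): E(X + Y) = E(X) + E(Y) still holds.
p = {(1, 1): 0.5, (2, 5): 0.3, (3, 2): 0.2}

E_sum = sum((x + y) * q for (x, y), q in p.items())
E_X = sum(x * q for (x, y), q in p.items())
E_Y = sum(y * q for (x, y), q in p.items())
print(E_sum, E_X + E_Y)  # identical values
```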
Multiplication Theorem of Expectation
Theorem 8.3: If X and Y are independent random variables, then
E(XY) = E(X) E(Y)
Proof:
Discrete Case:
Let (X, Y) be a two-dimensional discrete random variable which takes up the values $(x_i, y_j)$ with the joint probability mass function $p_{ij} = P[X = x_i \cap Y = y_j]$. Let $p_i$ and $p_j'$ be the marginal probability mass functions of X and Y respectively. Then

$E(X) = \sum_i x_i p_i, \quad E(Y) = \sum_j y_j p_j', \quad \text{and} \quad E(XY) = \sum_i \sum_j x_i y_j\,p_{ij}.$
Since X and Y are independent,

$p_{ij} = P[X = x_i \cap Y = y_j] = P[X = x_i]\,P[Y = y_j] = p_i\,p_j'$

[∵ if events A and B are independent, then P(A ∩ B) = P(A) P(B)]

Hence, $E(XY) = \sum_i \sum_j x_i y_j\,p_i\,p_j' = \left(\sum_i x_i p_i\right)\left(\sum_j y_j p_j'\right) = E(X)\,E(Y)$
Continuous Case:

Let (X, Y) be a bivariate continuous random variable with joint probability density function f(x, y), and let f(x) and f(y) be the marginal density functions of X and Y. Then

$E(X) = \int x f(x)\,dx, \quad E(Y) = \int y f(y)\,dy, \quad \text{and} \quad E(XY) = \iint xy\,f(x, y)\,dy\,dx.$

Now $E(XY) = \iint xy\,f(x, y)\,dy\,dx$

$= \iint xy\,f(x)\,f(y)\,dy\,dx$ [∵ X and Y are independent, f(x, y) = f(x) f(y); see Unit 7 of this course]

$= \int x f(x) \left(\int y f(y)\,dy\right) dx = \int x f(x)\,dx \int y f(y)\,dy$

$= E(X)\,E(Y)$
Remark 4: The result can be similarly extended for more than two
random variables.
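A corresponding numerical check for the multiplication theorem (our own toy marginals; the joint pmf of an independent pair is the product of its marginals):

```python
# Independent X and Y: joint probabilities are products of the marginal ones.
px = {1: 0.25, 2: 0.75}
py = {10: 0.4, 20: 0.6}

E_XY = sum(x * y * qx * qy for x, qx in px.items() for y, qy in py.items())
E_X = sum(x * q for x, q in px.items())
E_Y = sum(y * q for y, q in py.items())
print(E_XY, E_X * E_Y)  # identical values
```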
Example 8: Two unbiased dice are thrown. Find the expected value of
the sum of the numbers of points on them.
Solution: Let X be the number obtained on the first die and Y be the
number obtained on the second die, then
$E(X) = \frac{7}{2}$ and $E(Y) = \frac{7}{2}$ [See Example 3 given in Section 8.2]

The required expected value = E(X + Y)

= E(X) + E(Y) [Using addition theorem of expectation]

$= \frac{7}{2} + \frac{7}{2} = 7$
Remark 5: This example can also be done considering one random variable only, as follows:

Let X be the random variable denoting "the sum of the numbers of points on the dice"; then the probability distribution in this case is

X:     2     3     4     5     6     7     8     9     10    11    12
p(x): 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

and hence $E(X) = 2\left(\frac{1}{36}\right) + 3\left(\frac{2}{36}\right) + \dots + 12\left(\frac{1}{36}\right) = 7.$
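Both routes can be reproduced by brute-force enumeration of the 36 equally likely outcomes (a sketch of our own, using exact fractions):

```python
from fractions import Fraction

# All 36 equally likely outcomes of two dice; the mean of the sum is exactly 7.
E_sum = sum(Fraction(i + j, 36) for i in range(1, 7) for j in range(1, 7))
print(E_sum)  # 7
```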
Example 9: Two cards are drawn one by one with replacement from 8
cards numbered from 1 to 8. Find the expectation of the product of the
numbers on the drawn cards.
Solution: Let X be the number on the first card and Y be the number on
the second card. Then probability distribution of X is
X:     1    2    3    4    5    6    7    8
p(x): 1/8  1/8  1/8  1/8  1/8  1/8  1/8  1/8

and similarly the probability distribution of Y is

Y:     1    2    3    4    5    6    7    8
p(y): 1/8  1/8  1/8  1/8  1/8  1/8  1/8  1/8

$\therefore E(X) = E(Y) = 1\left(\frac{1}{8}\right) + 2\left(\frac{1}{8}\right) + \dots + 8\left(\frac{1}{8}\right) = \frac{1}{8}(1 + 2 + 3 + 4 + 5 + 6 + 7 + 8) = \frac{36}{8} = \frac{9}{2}$

Thus, the required expected value is

$E(XY) = E(X)\,E(Y)$ [Using multiplication theorem of expectation, since the draws are with replacement and hence X and Y are independent]

$= \frac{9}{2} \times \frac{9}{2} = \frac{81}{4}.$
Expectation of Linear Combination of Random Variables
Theorem 8.4: Let $X_1, X_2, \dots, X_n$ be any n random variables and if $a_1, a_2, \dots, a_n$ are any n constants, then

$E(a_1X_1 + a_2X_2 + \dots + a_nX_n) = a_1E(X_1) + a_2E(X_2) + \dots + a_nE(X_n).$

[Note: Here $a_1X_1 + a_2X_2 + \dots + a_nX_n$ is a linear combination of $X_1, X_2, \dots, X_n$.]
Now before ending this unit, let’s summarize what we have covered in it.
8.6 SUMMARY
The following main points have been covered in this unit:
1) Expected value of a random variable X is defined as

$E(X) = \begin{cases} \sum_{i=1}^{n} x_i p_i, & \text{if X is a discrete random variable} \\ \int_{-\infty}^{\infty} x f(x)\,dx, & \text{if X is a continuous random variable.} \end{cases}$
vi) If $X_1, X_2, \dots, X_n$ be any n random variables and if $a_1, a_2, \dots, a_n$ are any n constants, then

$E(a_1X_1 + a_2X_2 + \dots + a_nX_n) = a_1E(X_1) + a_2E(X_2) + \dots + a_nE(X_n).$
i) The rth order moment about any point A of a random variable X is given as

$\mu_r' = \begin{cases} \sum_i p_i (x_i - A)^r, & \text{if X is a discrete r.v.} \\ \int (x - A)^r f(x)\,dx, & \text{if X is a continuous r.v.} \end{cases}$

$= E(X - A)^r$

ii) Variance of a random variable X is given as

$V(X) = E[X - E(X)]^2 = E(X^2) - [E(X)]^2$

iii) Covariance of X and Y is given as

$\text{Cov}(X, Y) = E[(X - E(X))(Y - E(Y))]$
8.7 SOLUTIONS/ANSWERS
E1) Let X be the amount (in rupees) won by you.

X can take the values 100 and 0 with $P[X = 100] = P[\text{Head}] = \frac{1}{2}$, and $P[X = 0] = P[\text{Tail}] = \frac{1}{2}.$

∴ the probability distribution of X is

X:    100   0
p(x): 1/2  1/2
and hence the expected amount won by you is

$E(X) = 100\left(\frac{1}{2}\right) + 0\left(\frac{1}{2}\right) = \text{Rs } 50.$
E2) Let X be the number of tosses till a tail turns up.

X can take the values 1, 2, 3, 4, ... with

$P[X = 1] = P[\text{Tail in the first toss}] = \frac{1}{2},$

$P[X = 2] = P[\text{Head in the first and tail in the second toss}] = \frac{1}{2} \times \frac{1}{2} = \left(\frac{1}{2}\right)^2,$

$P[X = 3] = P[\text{HHT}] = \frac{1}{2} \times \frac{1}{2} \times \frac{1}{2} = \left(\frac{1}{2}\right)^3,$ and so on.

Probability distribution of X is

X:      1      2        3        4        5 ...
p(x):  1/2   (1/2)²   (1/2)³   (1/2)⁴   (1/2)⁵ ...

and hence

$E(X) = 1\left(\frac{1}{2}\right) + 2\left(\frac{1}{2}\right)^2 + 3\left(\frac{1}{2}\right)^3 + 4\left(\frac{1}{2}\right)^4 + \dots$ … (1)
Multiplying both sides by $\frac{1}{2}$, we get

$\frac{1}{2}E(X) = 1\left(\frac{1}{2}\right)^2 + 2\left(\frac{1}{2}\right)^3 + 3\left(\frac{1}{2}\right)^4 + 4\left(\frac{1}{2}\right)^5 + \dots$

i.e. $\frac{1}{2}E(X) = \left(\frac{1}{2}\right)^2 + 2\left(\frac{1}{2}\right)^3 + 3\left(\frac{1}{2}\right)^4 + \dots$ … (2)

[Shifting the position one step towards the right so that we get the terms having the same power at the same positions as in (1)]
Now, subtracting (2) from (1), we have

$E(X) - \frac{1}{2}E(X) = 1\left(\frac{1}{2}\right) + (2 - 1)\left(\frac{1}{2}\right)^2 + (3 - 2)\left(\frac{1}{2}\right)^3 + \dots$

$\Rightarrow \frac{1}{2}E(X) = \frac{1}{2} + \left(\frac{1}{2}\right)^2 + \left(\frac{1}{2}\right)^3 + \left(\frac{1}{2}\right)^4 + \dots$

$\Rightarrow \frac{1}{2}E(X) = \frac{1}{2}\left[1 + \frac{1}{2} + \left(\frac{1}{2}\right)^2 + \left(\frac{1}{2}\right)^3 + \dots\right]$

(the bracketed series is an infinite G.P. with first term a = 1 and common ratio r = 1/2)

$\Rightarrow E(X) = \frac{1}{1 - \frac{1}{2}} \qquad \left[\because S_\infty = \frac{a}{1 - r} \text{ (see Unit 3 of course MST-001)}\right]$

$= \frac{1}{1/2} = 2.$
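This answer is also easy to approximate by simulation. A minimal sketch of our own, assuming Python's standard random module:

```python
import random

# Average number of fair-coin tosses until the first tail appears.
def tosses_until_tail():
    n = 1
    while random.random() < 0.5:  # head with probability 1/2: keep tossing
        n += 1
    return n

trials = 100_000
print(sum(tosses_until_tail() for _ in range(trials)) / trials)  # close to 2
```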
E3) $E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$

$= \int_{-\infty}^{0} x f(x)\,dx + \int_{0}^{1} x f(x)\,dx + \int_{1}^{2} x f(x)\,dx + \int_{2}^{\infty} x f(x)\,dx$

$= \int_{-\infty}^{0} x(0)\,dx + \int_{0}^{1} x\left(x^3\right)dx + \int_{1}^{2} x(2 - x)^3\,dx + \int_{2}^{\infty} x(0)\,dx$

$= 0 + \int_{0}^{1} x^4\,dx + \int_{1}^{2} x\left(8 - 12x + 6x^2 - x^3\right)dx + 0$

$= \int_{0}^{1} x^4\,dx + \int_{1}^{2} \left(8x - 12x^2 + 6x^3 - x^4\right)dx$

$= \left[\frac{x^5}{5}\right]_0^1 + \left[8\frac{x^2}{2} - 12\frac{x^3}{3} + 6\frac{x^4}{4} - \frac{x^5}{5}\right]_1^2$

$= \frac{1}{5} + \left[\left(16 - 32 + 24 - \frac{32}{5}\right) - \left(4 - 4 + \frac{3}{2} - \frac{1}{5}\right)\right]$

$= \frac{1}{5} + \frac{8}{5} - \frac{13}{10} = \frac{1}{5} + \frac{3}{10} = \frac{1}{2}.$
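As a quick numerical check of this integral (a sketch of our own; the midpoint rule and the step count are arbitrary choices):

```python
# Midpoint-rule check of E(X) = ∫ x f(x) dx for the density given in E3.
f = lambda x: x**3 if 0 <= x <= 1 else (2 - x)**3 if 1 < x <= 2 else 0.0
n = 100_000
xs = [2 * (i + 0.5) / n for i in range(n)]  # midpoints of [0, 2]
print(sum(x * f(x) for x in xs) * (2 / n))  # ~0.5
```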
E4) As X is a random variable with mean $\mu$,

$E(X) = \mu$ … (1)

Now, $E(Z) = E\left(\frac{X - \mu}{\sigma}\right)$

$= \frac{1}{\sigma}\,E(X - \mu)$ [Using Property 2 of Sec. 8.3]

$= \frac{1}{\sigma}\left[E(X) - \mu\right]$ [Using Property 3 of Sec. 8.3]

$= \frac{1}{\sigma}(\mu - \mu)$ [Using (1)]

$= 0.$

Note: Mean of a standard random variable is zero.
E5) Variance of the standard random variable $Z = \frac{X - \mu}{\sigma}$ is given as

$V(Z) = V\left(\frac{X - \mu}{\sigma}\right) = V\left(\frac{X}{\sigma} - \frac{\mu}{\sigma}\right)$

$= \frac{1}{\sigma^2}\,V(X)$ [Using the result of Theorem 8.1 of this unit]

$= \frac{1}{\sigma^2}\left(\sigma^2\right)$ [∵ it is given that the standard deviation of X is $\sigma$ and hence its variance is $\sigma^2$]

$= 1.$

Note: The mean of a standard random variate is '0' [see E4)] and its variance is 1.
E6) Given that E(Y) = 0, i.e. E(aX − b) = 0

⇒ aE(X) − b = 0 ⇒ a(10) − b = 0 ⇒ 10a − b = 0 … (1)

Also, as V(Y) = 1, hence V(aX − b) = 1

⇒ a²V(X) = 1 ⇒ a²(25) = 1 ⇒ a² = 1/25

⇒ a = 1/5 [∵ a is positive]

From (1), we have

10(1/5) − b = 0 ⇒ 2 − b = 0 ⇒ b = 2.

Hence, a = 1/5, b = 2.
E7) Let X be the number on the first card and Y be the number on the second card. Then the probability distribution of X is:

X:     1     2     3     4     5     6     7     8     9     10
p(x): 1/10  1/10  1/10  1/10  1/10  1/10  1/10  1/10  1/10  1/10

and similarly the probability distribution of Y is:

Y:     1     2     3     4     5     6     7     8     9     10
p(y): 1/10  1/10  1/10  1/10  1/10  1/10  1/10  1/10  1/10  1/10

$E(X) = E(Y) = 1\left(\frac{1}{10}\right) + 2\left(\frac{1}{10}\right) + \dots + 10\left(\frac{1}{10}\right) = \frac{1}{10}(1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9 + 10) = \frac{55}{10} = 5.5$

and hence the required expected value is

$E(X + Y) = E(X) + E(Y) = 5.5 + 5.5 = 11.$
E8) Let X be the number obtained on the first die and Y be the number obtained on the second die.

Then $E(X) = E(Y) = \frac{7}{2}.$ [See Example 3 given in Section 8.2]

Hence, the required expected value is

$E(XY) = E(X)\,E(Y)$ [Using multiplication theorem of expectation]

$= \frac{7}{2} \times \frac{7}{2} = \frac{49}{4}.$