
2.3 Mathematical Expectation
The term expectation is used for the process of averaging when a random variable
is involved. It is the number used to locate the centre of the probability
distribution (p.m.f. or p.d.f.) of a random variable. A probability distribution is
described by certain statistical measures which are computed using mathematical
expectation (or simply expectation).

Let X be a random variable defined on a sample space S. Let g(X) be a function of
X such that g(X) is also a random variable. Then the expected value of g(X) is
defined by

$$E[g(X)] = \begin{cases} \displaystyle\sum_x g(x)\,p(x) & \text{if } X \text{ is a d.r.v. with p.m.f. } p(x)\\[6pt] \displaystyle\int_{-\infty}^{\infty} g(x)\,f(x)\,dx & \text{if } X \text{ is a c.r.v. with p.d.f. } f(x) \end{cases} \qquad (1)$$

provided these values exist.
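To see definition (1) in action, here is a minimal Python sketch (assuming NumPy and SciPy are available; the helper names are illustrative choices, not from the text) that computes E[g(X)] by direct summation in the discrete case and by numerical quadrature in the continuous case:

```python
import numpy as np
from scipy.integrate import quad

def expect_discrete(g, xs, ps):
    """E[g(X)] = sum over x of g(x) p(x), for a d.r.v. with p.m.f. (xs, ps)."""
    return sum(g(x) * p for x, p in zip(xs, ps))

def expect_continuous(g, f, a=-np.inf, b=np.inf):
    """E[g(X)] = integral of g(x) f(x) dx, for a c.r.v. with p.d.f. f."""
    value, _ = quad(lambda x: g(x) * f(x), a, b)
    return value

# Fair die (Example 2 below): E[X] = 3.5
print(expect_discrete(lambda x: x, [1, 2, 3, 4, 5, 6], [1/6] * 6))

# Uniform p.d.f. f(x) = 1/2 on (0, 2) (Example 5 below): E[X] = 1
print(expect_continuous(lambda x: x, lambda x: 0.5, 0, 2))
```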

Mean and moments:

i. Let g(X) = X. Then, by formula (1), the expected value of X is defined by

$$E[X] = \begin{cases} \displaystyle\sum_x x\,p(x) & \text{if } X \text{ is a d.r.v. with p.m.f. } p(x)\\[6pt] \displaystyle\int_{-\infty}^{\infty} x\,f(x)\,dx & \text{if } X \text{ is a c.r.v. with p.d.f. } f(x) \end{cases}$$

Then E[X] is called the mean of the random variable X and is denoted by $\mu$.

ii. Let g(X) = (X − A)^r, where A is an arbitrary constant and r is a non-negative
integer. Then formula (1) gives

$$E[(X-A)^r] = \begin{cases} \displaystyle\sum_x (x-A)^r\,p(x) & \text{if } X \text{ is a d.r.v. with p.m.f. } p(x)\\[6pt] \displaystyle\int_{-\infty}^{\infty} (x-A)^r\,f(x)\,dx & \text{if } X \text{ is a c.r.v. with p.d.f. } f(x) \end{cases}$$

The quantity $E[(X-A)^r]$ is called the r-th moment about A and is denoted by $\mu_r'$.
If A = 0, then the quantities $E[X^r]$ are known as raw moments.

iii. Let g(X) = (X − μ)^r. Then formula (1) gives

$$E[(X-\mu)^r] = \begin{cases} \displaystyle\sum_x (x-\mu)^r\,p(x) & \text{if } X \text{ is a d.r.v. with p.m.f. } p(x)\\[6pt] \displaystyle\int_{-\infty}^{\infty} (x-\mu)^r\,f(x)\,dx & \text{if } X \text{ is a c.r.v. with p.d.f. } f(x) \end{cases}$$

The quantity $E[(X-\mu)^r]$ is called the r-th central moment of X and is denoted by $\mu_r$.

iv. If r = 2, then $\mu_2 = E[(X-\mu)^2]$, which is known as the variance of the
random variable X and is denoted by Var(X) or $\sigma^2$.

v. The mean $\mu$ and the variance $\sigma^2$ are important statistical measures of a
probability distribution.
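A small sketch of items (i)–(iv), computing moments of a discrete p.m.f. by direct summation; the function name moment_about is my own choice, and the p.m.f. is the Binomial(4, 1/2) distribution that appears in Example 4 below:

```python
def moment_about(xs, ps, A, r):
    """r-th moment about A, E[(X - A)^r], for a d.r.v. with p.m.f. (xs, ps)."""
    return sum((x - A) ** r * p for x, p in zip(xs, ps))

# Number of heads in four coin tosses (Example 4 below): Binomial(4, 1/2)
xs = [0, 1, 2, 3, 4]
ps = [1/16, 4/16, 6/16, 4/16, 1/16]

mu = moment_about(xs, ps, 0, 1)    # mean = first raw moment = 2.0
var = moment_about(xs, ps, mu, 2)  # variance = second central moment = 1.0
print(mu, var)
```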

Example 1: Let X be a d.r.v. with the p.m.f. given below:

Find E(X) and Var(X).

Solution:
Example 2: Find the expectation of the number on a die when thrown.

Solution: Let X be the random variable representing the number on a die when
thrown. Then X can take any one of the values 1, 2, 3, 4, 5, 6, each with equal
probability 1/6. Hence

$$E[X] = \frac{1}{6}(1+2+3+4+5+6) = \frac{21}{6} = 3.5$$

Example 3: Two unbiased dice are thrown. Find the expected value of the sum
of the numbers of points on them.

Solution: Define X as the sum of the numbers obtained on the two dice. Then X
takes the values 2, 3, ..., 12 and its probability distribution is given by

x    :  2     3     4     5     6     7     8     9     10    11    12
p(x) : 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

Hence $E[X] = \sum_x x\,p(x) = \frac{252}{36} = 7$.
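Examples 2 and 3 can be checked by brute-force enumeration of the equally likely outcomes; a quick Python sketch:

```python
from itertools import product

# Example 2: average of the six equally likely faces.
print(sum(range(1, 7)) / 6)                       # 3.5

# Example 3: average of the 36 equally likely sums of two dice.
sums = [a + b for a, b in product(range(1, 7), repeat=2)]
print(sum(sums) / len(sums))                      # 7.0
```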

Example 4: In four tosses of a coin, let X be the number of heads. Find the mean
and variance of X.

Solution: The sample space consists of $2^4 = 16$ equally likely outcomes, and for
each outcome X records the number of heads (for instance, X = 3 for the outcome
HTHH). Counting the outcomes that yield each value, the p.m.f. of X is given in
the following table:

x    :  0     1     2     3     4
p(x) : 1/16  4/16  6/16  4/16  1/16

Hence

$$E[X] = \sum_x x\,p(x) = \frac{0 \cdot 1 + 1 \cdot 4 + 2 \cdot 6 + 3 \cdot 4 + 4 \cdot 1}{16} = \frac{32}{16} = 2$$

$$\mathrm{Var}(X) = \sum_x (x-2)^2\,p(x) = \frac{4 \cdot 1 + 1 \cdot 4 + 0 \cdot 6 + 1 \cdot 4 + 4 \cdot 1}{16} = \frac{16}{16} = 1$$
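The same enumeration idea verifies Example 4 directly; a short sketch:

```python
from itertools import product

# Enumerate the 16 equally likely outcomes of four coin tosses.
values = [seq.count('H') for seq in product('HT', repeat=4)]
mean = sum(values) / len(values)
var = sum((v - mean) ** 2 for v in values) / len(values)
print(mean, var)   # 2.0 1.0
```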


Example 5: Find the mean and variance of the random variable X, whose p.d.f. is
given by

$$f(x) = \begin{cases} \frac{1}{2}, & 0 < x < 2\\ 0, & \text{otherwise} \end{cases}$$

Solution:

$$E[X] = \int_{-\infty}^{\infty} x\,f(x)\,dx = \frac{1}{2}\int_0^2 x\,dx = \frac{1}{2}\left[\frac{x^2}{2}\right]_0^2 = \frac{1}{2}(2-0) = 1$$

Mean of the random variable X is $\mu = 1$.

Variance:

$$\mathrm{Var}(X) = \int_{-\infty}^{\infty} (x-1)^2 f(x)\,dx = \frac{1}{2}\int_0^2 (x-1)^2\,dx = \frac{1}{2}\left[\frac{(x-1)^3}{3}\right]_0^2 = \frac{1}{2}\left(\frac{1}{3}+\frac{1}{3}\right) = \frac{1}{3}$$

Variance of the random variable X is $\sigma^2 = \frac{1}{3}$.

Example 6: Find the mean of the random variable X whose p.d.f. is given by

$$f(x) = \begin{cases} e^{-x}, & x \ge 0\\ 0, & \text{otherwise} \end{cases}$$

Solution: Integrating by parts,

$$E[X] = \int_{-\infty}^{\infty} x\,f(x)\,dx = \int_0^{\infty} x e^{-x}\,dx = \left[-x e^{-x}\right]_0^{\infty} + \int_0^{\infty} e^{-x}\,dx = 0 + \left[-e^{-x}\right]_0^{\infty} = 0 + 1 = 1$$
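Both integrals (Examples 5 and 6) can be confirmed numerically with SciPy's quad; a minimal sketch, assuming SciPy is installed:

```python
import numpy as np
from scipy.integrate import quad

# Example 5: f(x) = 1/2 on (0, 2)
mean5, _ = quad(lambda x: x * 0.5, 0, 2)
var5, _ = quad(lambda x: (x - mean5) ** 2 * 0.5, 0, 2)
print(mean5, var5)      # 1.0 0.333...

# Example 6: f(x) = exp(-x) on [0, infinity)
mean6, _ = quad(lambda x: x * np.exp(-x), 0, np.inf)
print(mean6)            # 1.0
```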

Theorems on Mathematical Expectation:

The following theorems are proved assuming that the random variables are
continuous. If the random variables are discrete, the proofs remain the same,
with integration replaced by summation.

Theorem 1: If X is a random variable and a and b are constants, then
E(aX + b) = aE(X) + b.

Proof: Let X be a c.r.v. with p.d.f. f(x). Then

$$E(aX+b) = \int_{-\infty}^{\infty} (ax+b)\,f(x)\,dx = a\int_{-\infty}^{\infty} x\,f(x)\,dx + b\int_{-\infty}^{\infty} f(x)\,dx = aE(X) + b \qquad \left(\because \int_{-\infty}^{\infty} f(x)\,dx = 1\right)$$

Corollary 1: If b = 0, then E(aX) = aE(X).

Corollary 2: If a = 0 and b = c, a constant, then E(c) = c; that is, the expectation
of a constant is the constant itself.
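Theorem 1 is easy to check by simulation; the sketch below (assuming NumPy; the sample size and distribution are arbitrary choices) compares the sample mean of aX + b with a times the sample mean of X plus b:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)   # E(X) = 1
a, b = 3.0, 2.0

# Both estimates should agree (up to sampling error) at ~5.0.
print(np.mean(a * x + b))
print(a * np.mean(x) + b)
```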

Theorem 2: Addition theorem of mathematical expectation.

If X and Y are random variables, then E(X + Y) = E(X) + E(Y), provided all the
expectations exist.

Proof: Let X and Y be continuous random variables with j.p.d.f. f(x, y) and
m.p.d.f.s f₁(x) and f₂(y) respectively. Then, by definition,

$$E(X) = \int_{-\infty}^{\infty} x\,f_1(x)\,dx \quad\text{and}\quad E(Y) = \int_{-\infty}^{\infty} y\,f_2(y)\,dy$$

Now,

$$E(X+Y) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x+y)\,f(x,y)\,dx\,dy$$

$$= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,f(x,y)\,dx\,dy + \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y\,f(x,y)\,dx\,dy$$

$$= \int_{-\infty}^{\infty} x\left[\int_{-\infty}^{\infty} f(x,y)\,dy\right]dx + \int_{-\infty}^{\infty} y\left[\int_{-\infty}^{\infty} f(x,y)\,dx\right]dy$$

$$= \int_{-\infty}^{\infty} x\,f_1(x)\,dx + \int_{-\infty}^{\infty} y\,f_2(y)\,dy = E(X) + E(Y)$$

Generalization: If $X_1, X_2, \ldots, X_n$ are random variables, then
$E(X_1 + X_2 + \cdots + X_n) = E(X_1) + E(X_2) + \cdots + E(X_n)$, provided all
the expectations exist.

Theorem 3: Multiplication theorem of mathematical expectation.

If X and Y are independent random variables, then E(XY) = E(X)E(Y).

Proof: Let X and Y be continuous random variables with j.p.d.f. f(x, y) and
m.p.d.f.s f₁(x) and f₂(y) respectively. Then, by definition,

$$E(X) = \int_{-\infty}^{\infty} x\,f_1(x)\,dx \quad\text{and}\quad E(Y) = \int_{-\infty}^{\infty} y\,f_2(y)\,dy$$

Now,

$$E(XY) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\,f(x,y)\,dx\,dy$$

$$= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\,f_1(x)\,f_2(y)\,dx\,dy \qquad (\because X \text{ and } Y \text{ are independent})$$

$$= \left[\int_{-\infty}^{\infty} x\,f_1(x)\,dx\right]\left[\int_{-\infty}^{\infty} y\,f_2(y)\,dy\right] = E(X)\,E(Y)$$

Generalization: If $X_1, X_2, \ldots, X_n$ are independent random variables, then
$E(X_1 X_2 \cdots X_n) = E(X_1)\,E(X_2)\cdots E(X_n)$.
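A simulation sketch of Theorem 3, again with NumPy and arbitrary distribution choices: the product rule holds for independent samples but fails under dependence (here, taking Y = X):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=1.0, scale=1.0, size=1_000_000)
y = rng.normal(loc=2.0, scale=1.0, size=1_000_000)   # independent of x

# Independent: E(XY) = E(X)E(Y) = 2, so both estimates are ~2.0.
print(np.mean(x * y), np.mean(x) * np.mean(y))

# Dependent (Y = X): E(X^2) = 2 but [E(X)]^2 = 1, so the rule fails.
print(np.mean(x * x), np.mean(x) ** 2)
```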
Theorem 4: Mathematical expectation of a linear combination of random
variables.

Let $X_1, X_2, \ldots, X_n$ be any random variables and $a_1, a_2, \ldots, a_n$ be any
constants. Then

$$E\left(\sum_{i=1}^{n} a_i X_i\right) = \sum_{i=1}^{n} a_i\,E(X_i)$$

provided all the expectations exist.

The proof follows using Theorem 1 and the generalization of Theorem 2.

Theorem 5: $\mathrm{Var}(X) = E(X^2) - [E(X)]^2$.

Proof:

$$\mathrm{Var}(X) = E[(X-\mu)^2] = E[X^2 - 2\mu X + \mu^2] = E(X^2) - 2\mu\,E(X) + \mu^2$$

since $\mu = E(X)$ is a constant and $E(\mu^2) = \mu^2$. Hence

$$\mathrm{Var}(X) = E(X^2) - 2\mu^2 + \mu^2 = E(X^2) - [E(X)]^2$$

Note: The formula $\mathrm{Var}(X) = E(X^2) - [E(X)]^2$ is simpler to use than
$\mathrm{Var}(X) = E[(X-\mu)^2]$.
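The shortcut formula of Theorem 5 translates directly into code; a minimal sketch with an illustrative helper name, checked on a fair die:

```python
def variance_discrete(xs, ps):
    """Var(X) = E(X^2) - [E(X)]^2 for a d.r.v. with p.m.f. (xs, ps)."""
    ex = sum(x * p for x, p in zip(xs, ps))
    ex2 = sum(x * x * p for x, p in zip(xs, ps))
    return ex2 - ex ** 2

# Fair die: E(X) = 7/2, E(X^2) = 91/6, so Var(X) = 35/12 ~ 2.9167.
print(variance_discrete(range(1, 7), [1/6] * 6))
```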

Theorem 6: If X is a random variable, and a and b are constants, then
$\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X)$.

Proof: Let Y = aX + b. Then E(Y) = aE(X) + b, and

$$Y - E(Y) = a\,[X - E(X)]$$

$$\mathrm{Var}(Y) = E[(Y - E(Y))^2] = a^2\,E[(X - E(X))^2] = a^2\,\mathrm{Var}(X)$$

Corollary 1: If a = 0, then Var(b) = 0; that is, the variance of a constant is zero.

Corollary 2: If b = 0, then $\mathrm{Var}(aX) = a^2\,\mathrm{Var}(X)$.

Covariance: If X and Y are two random variables, then the covariance between
them is defined by

$$\mathrm{Cov}(X, Y) = E\{[X - E(X)][Y - E(Y)]\} = E(XY) - E(X)\,E(Y)$$

Note:

1. If X and Y are independent, then Cov(X, Y) = 0.

2. Cov(aX, bY) = ab Cov(X, Y), where a and b are constants.

3. Cov(X + a, Y + b) = Cov(X, Y).
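Notes 1–3 can be illustrated by simulation; in this sketch (NumPy assumed, distributions arbitrary) the helper cov estimates Cov(U, V) via E(UV) − E(U)E(V):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=1_000_000)
y = x + rng.normal(size=1_000_000)      # dependent on x, Cov(X, Y) = 1

def cov(u, v):
    """Cov(U, V) estimated as E(UV) - E(U)E(V) from samples."""
    return np.mean(u * v) - np.mean(u) * np.mean(v)

print(cov(x, y))                             # ~1.0
print(cov(3 * x, 2 * y), 6 * cov(x, y))      # Note 2: both ~6.0
print(cov(x + 5, y - 7))                     # Note 3: shifts change nothing, ~1.0
```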

Theorem 7: Variance of a linear combination of random variables.

Let $X_1, X_2, \ldots, X_n$ be any random variables and $a_1, a_2, \ldots, a_n$ be any
constants. Then

$$V\left(\sum_{i=1}^{n} a_i X_i\right) = \sum_{i=1}^{n} a_i^2\,V(X_i) + 2\sum_{i<j} a_i a_j\,\mathrm{Cov}(X_i, X_j)$$

Proof:

Let $U = \sum_{i=1}^{n} a_i X_i$. Then $E(U) = \sum_{i=1}^{n} a_i\,E(X_i)$ and

$$U - E(U) = \sum_{i=1}^{n} a_i\,[X_i - E(X_i)]$$

Squaring,

$$[U - E(U)]^2 = \sum_{i=1}^{n} a_i^2\,[X_i - E(X_i)]^2 + 2\sum_{i<j} a_i a_j\,[X_i - E(X_i)][X_j - E(X_j)]$$

Taking expectations on both sides,

$$V(U) = \sum_{i=1}^{n} a_i^2\,E[X_i - E(X_i)]^2 + 2\sum_{i<j} a_i a_j\,E\{[X_i - E(X_i)][X_j - E(X_j)]\}$$

that is,

$$V\left(\sum_{i=1}^{n} a_i X_i\right) = V(U) = \sum_{i=1}^{n} a_i^2\,V(X_i) + 2\sum_{i<j} a_i a_j\,\mathrm{Cov}(X_i, X_j)$$

Note:

1. If $X_1, X_2, \ldots, X_n$ are independent, then $V\left(\sum_{i=1}^{n} a_i X_i\right) = \sum_{i=1}^{n} a_i^2\,V(X_i)$.

2. If n = 2 and $a_1 = a_2 = 1$, then $V(X_1 + X_2) = V(X_1) + V(X_2) + 2\,\mathrm{Cov}(X_1, X_2)$.

3. If n = 2, $a_1 = 1$ and $a_2 = -1$, then $V(X_1 - X_2) = V(X_1) + V(X_2) - 2\,\mathrm{Cov}(X_1, X_2)$.

4. If $X_1$ and $X_2$ are independent, then $V(X_1 \pm X_2) = V(X_1) + V(X_2)$.
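A simulation sketch of Theorem 7 for n = 2 (NumPy assumed; the coefficients and distributions are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
x1 = rng.normal(size=1_000_000)                 # V(X1) = 1
x2 = x1 + rng.normal(size=1_000_000)            # V(X2) = 2, Cov(X1, X2) = 1
a1, a2 = 2.0, -3.0

lhs = np.var(a1 * x1 + a2 * x2)
rhs = (a1**2 * np.var(x1) + a2**2 * np.var(x2)
       + 2 * a1 * a2 * np.cov(x1, x2)[0, 1])
print(lhs, rhs)   # both ~10.0 (= 4*1 + 9*2 - 12*1)
```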

Example 7: The j.p.d.f. of X and Y is given by

$$f(x, y) = \begin{cases} 2 - x - y, & 0 < x < 1,\ 0 < y < 1\\ 0, & \text{otherwise} \end{cases}$$

Find

i. the m.p.d.f.s of X and Y
ii. the c.p.d.f.s of X given Y and of Y given X
iii. E(X), Var(X), E(Y) and Var(Y)
iv. the covariance between X and Y

Solution:

i.
$$f_1(x) = \int_{-\infty}^{\infty} f(x,y)\,dy = \int_0^1 (2-x-y)\,dy = \left[2y - xy - \frac{y^2}{2}\right]_0^1 = 2 - x - \frac{1}{2} = \frac{3}{2} - x$$

$$f_1(x) = \begin{cases} \frac{3}{2} - x, & 0 < x < 1\\ 0, & \text{otherwise} \end{cases}$$

Similarly,

$$f_2(y) = \begin{cases} \frac{3}{2} - y, & 0 < y < 1\\ 0, & \text{otherwise} \end{cases}$$

ii.
$$f(x \mid y) = \frac{f(x,y)}{f_2(y)} = \frac{2-x-y}{\frac{3}{2}-y}, \quad 0 < x < 1$$

and

$$f(y \mid x) = \frac{f(x,y)}{f_1(x)} = \frac{2-x-y}{\frac{3}{2}-x}, \quad 0 < y < 1$$

iii.
$$E(X) = \int_0^1 x\,f_1(x)\,dx = \int_0^1 x\left(\frac{3}{2}-x\right)dx = \frac{3}{4} - \frac{1}{3} = \frac{5}{12}$$

and

$$E(X^2) = \int_0^1 x^2 f_1(x)\,dx = \int_0^1 x^2\left(\frac{3}{2}-x\right)dx = \frac{1}{2} - \frac{1}{4} = \frac{1}{4}$$

Thus $\mathrm{Var}(X) = E(X^2) - [E(X)]^2 = \frac{1}{4} - \frac{25}{144} = \frac{11}{144}$.

Similarly, $E(Y) = \frac{5}{12}$ and $\mathrm{Var}(Y) = \frac{11}{144}$.

iv.
$$E(XY) = \int_0^1\int_0^1 xy\,f(x,y)\,dx\,dy = \int_0^1\int_0^1 xy\,(2-x-y)\,dx\,dy = \frac{1}{6} \quad \text{(verify!)}$$

Hence $\mathrm{Cov}(X, Y) = E(XY) - E(X)\,E(Y) = \frac{1}{6} - \frac{25}{144} = -\frac{1}{144}$.
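The integrals in Example 7 can be verified numerically with SciPy's dblquad; a minimal sketch (note that dblquad expects the integrand as func(y, x), integrating over y first):

```python
from scipy.integrate import dblquad

# Joint p.d.f. on the unit square, in dblquad's (y, x) argument order.
f = lambda y, x: 2 - x - y

exy, _ = dblquad(lambda y, x: x * y * f(y, x), 0, 1, 0, 1)   # E(XY) = 1/6
ex, _ = dblquad(lambda y, x: x * f(y, x), 0, 1, 0, 1)        # E(X) = 5/12
ey, _ = dblquad(lambda y, x: y * f(y, x), 0, 1, 0, 1)        # E(Y) = 5/12

print(exy - ex * ey)   # Cov(X, Y) = 1/6 - 25/144 = -1/144 ~ -0.00694
```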
