CH5. Operations on Multiple Random Variables

The document discusses operations on multiple random variables, including expected value, joint moments, joint characteristic functions, and examples. It defines the expected value of a function of random variables as an integral over the joint probability density function. It also defines joint moments, central moments, covariance, and the correlation coefficient.


Philadelphia University

Department of Communication & Electronics Engineering

Probability & Random Variables


Chapter Five

Instructor
Ibrahim N. Abu-Isbeih

Email: [email protected]
Website: www.abusbeih.com/ecourse
Ch.5: Operations on Multiple Random Variables
1. Expected Value of a Function of Random Variables
2. Joint Moments
3. Joint Characteristic Functions



1: Expectation
• The expected value of a function g(X, Y) of two random variables X and Y is given by

$$\bar{g} = E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f_{X,Y}(x,y)\, dx\, dy$$

• For N random variables $X_1, X_2, \ldots, X_N$,

$$\bar{g} = E[g(X_1, \ldots, X_N)] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} g(x_1, \ldots, x_N)\, f_{X_1, \ldots, X_N}(x_1, \ldots, x_N)\, dx_1 \cdots dx_N$$

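In practice this double integral can be sanity-checked numerically. The following is a minimal Monte Carlo sketch; the particular distributions (a Gaussian X and a uniform Y) are illustrative assumptions, not taken from the slides:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    x = rng.normal(3.0, np.sqrt(2.0), size=n)  # X ~ N(3, 2): mean 3, variance 2
    y = rng.uniform(0.0, 2.0, size=n)          # Y ~ U(0, 2), independent of X

    # E[g(X,Y)] for g(x,y) = (x + y)^2. Analytically this is
    # Var(X) + Var(Y) + (E[X] + E[Y])^2 = 2 + 1/3 + 16 = 18.333...
    print(np.mean((x + y) ** 2))

The sample mean of g(x, y) converges to the double integral above as n grows.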


2: Joint Moments
Joint Moments about the Origin:
• The joint moments of the random variables X and Y about the origin are defined by

$$m_{nk} = E[X^n Y^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^n y^k\, f_{X,Y}(x,y)\, dx\, dy$$

The sum n + k is called the order of the moments.

Clearly, $m_{n0} = E[X^n]$ are the moments of X, and $m_{0k} = E[Y^k]$ are the moments of Y.
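In sample terms, m_nk is simply the average of x^n y^k over draws from the joint distribution. A small sketch (the joint distribution below is an illustrative assumption):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000
    x = rng.normal(0.0, 1.0, size=n)
    y = 2.0 * x + rng.normal(0.0, 1.0, size=n)  # Y depends on X by construction

    def m(n_order, k_order):
        # Sample estimate of m_nk = E[X^n Y^k]
        return np.mean(x ** n_order * y ** k_order)

    print(m(1, 0), m(0, 1))  # first-order moments (the means), both ~0
    print(m(2, 0), m(1, 1))  # E[X^2] ~ 1, and E[XY] = 2*E[X^2] ~ 2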



The first-order joint moments:

$$m_{10} = E[X] = \bar{X} \quad \text{(the mean value of X)}$$
$$m_{01} = E[Y] = \bar{Y} \quad \text{(the mean value of Y)}$$

The second-order joint moments:

$$m_{20} = E[X^2] \quad \text{(the mean-square value of X)}$$
$$m_{02} = E[Y^2] \quad \text{(the mean-square value of Y)}$$
$$m_{11} = E[XY] \quad \text{(the correlation of X and Y)}$$



The correlation of X and Y is important to later work:

$$R_{XY} = m_{11} = E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f_{X,Y}(x,y)\, dx\, dy$$

• If $R_{XY} = E[X]E[Y]$, then X and Y are said to be uncorrelated.

• Statistical independence of X and Y is sufficient to guarantee that they are uncorrelated, but the converse is not true in general (see the numerical counterexample after this list).

• If $R_{XY} = 0$, then X and Y are called orthogonal.
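The gap between "uncorrelated" and "independent" is easy to demonstrate numerically. In this illustrative construction (not from the slides), Y is completely determined by X yet uncorrelated with it:

    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.normal(0.0, 1.0, size=1_000_000)
    y = x ** 2                      # fully dependent on X

    r_xy = np.mean(x * y)           # E[XY] = E[X^3] = 0 for a symmetric X
    print(r_xy)                     # ~0, so X and Y are also orthogonal here
    print(np.mean(x) * np.mean(y))  # ~0, so R_XY = E[X]E[Y]: uncorrelated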



Example: Let X be a random variable with mean value E[X] = 3 and variance $\sigma_X^2 = 2$. Another random variable is defined by

$$Y = -6X + 22$$

Find the mean value of Y, the variance of Y, and the correlation of X and Y.
Solution:

$$\sigma_X^2 = E[X^2] - \bar{X}^2 \;\Rightarrow\; E[X^2] = \sigma_X^2 + \bar{X}^2 = 2 + 3^2 = 11$$

$$E[Y] = E[-6X + 22] = -6E[X] + 22 = 4$$

$$E[Y^2] = E[(-6X + 22)^2] = E[36X^2 - 264X + 484] = 36E[X^2] - 264E[X] + 484 = 88$$

$$\sigma_Y^2 = E[Y^2] - \bar{Y}^2 = 88 - 16 = 72$$

$$R_{XY} = E[XY] = E[-6X^2 + 22X] = -6E[X^2] + 22E[X] = -6(11) + 22(3) = 0$$

Since $R_{XY} = 0$, X and Y are orthogonal. Since $R_{XY} \neq E[X]E[Y] = 12$, X and Y are not uncorrelated.
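These numbers are easy to cross-check by simulation. A sketch, assuming for concreteness a Gaussian X (the example only fixes its mean and variance):

    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.normal(3.0, np.sqrt(2.0), size=1_000_000)  # E[X] = 3, var = 2
    y = -6.0 * x + 22.0

    print(np.mean(y))      # ~ 4
    print(np.var(y))       # ~ 72
    print(np.mean(x * y))  # R_XY ~ 0, so X and Y are orthogonal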
Joint Central Moments:
• The joint central moments of the random variables X and Y are defined by

$$\mu_{nk} = E[(X - \bar{X})^n (Y - \bar{Y})^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - \bar{X})^n (y - \bar{Y})^k\, f_{X,Y}(x,y)\, dx\, dy$$

The sum n + k is called the order of the moments.

The first-order central moments are both zero:

$$\mu_{10} = E[X - \bar{X}] = 0, \qquad \mu_{01} = E[Y - \bar{Y}] = 0$$



• The second-order central moments:

$$\mu_{20} = E[(X - \bar{X})^2] = \sigma_X^2$$
$$\mu_{02} = E[(Y - \bar{Y})^2] = \sigma_Y^2$$
$$\mu_{11} = E[(X - \bar{X})(Y - \bar{Y})] = C_{XY}$$

• The second-order moment $\mu_{11}$ is called the covariance of X and Y and is given by

$$C_{XY} = \mu_{11} = E[(X - \bar{X})(Y - \bar{Y})] = R_{XY} - E[X]E[Y]$$

• The normalized second-order moment

$$\rho = \frac{\mu_{11}}{\sqrt{\mu_{20}\, \mu_{02}}} = \frac{C_{XY}}{\sigma_X \sigma_Y}$$

is called the correlation coefficient (a sample-based estimate of both quantities is sketched after this list).

• If X and Y are either independent or uncorrelated, then $C_{XY} = 0$.
• If X and Y are orthogonal, then $C_{XY} = -E[X]E[Y]$.
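A minimal sketch of estimating $C_{XY}$ and $\rho$ from samples (the joint distribution below is an illustrative assumption):

    import numpy as np

    rng = np.random.default_rng(4)
    n = 1_000_000
    x = rng.normal(0.0, 1.0, size=n)
    y = 0.5 * x + rng.normal(0.0, 1.0, size=n)  # C_XY = 0.5 by construction

    c_xy = np.mean((x - x.mean()) * (y - y.mean()))  # sample covariance
    rho = c_xy / (x.std() * y.std())                 # correlation coefficient
    print(c_xy, rho)                # ~0.5 and ~0.5/sqrt(1.25) = 0.447
    print(np.corrcoef(x, y)[0, 1])  # NumPy's built-in estimate agrees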
Example:
From the previous example,

$$\bar{X} = 3, \quad \sigma_X^2 = 2, \quad Y = -6X + 22, \quad \bar{Y} = 4, \quad R_{XY} = 0$$

Find the covariance of X and Y.

Solution:

$$C_{XY} = R_{XY} - \bar{X}\bar{Y} = 0 - (3)(4) = -12$$

Note that $C_{XY} = -\bar{X}\bar{Y}$, because X and Y are orthogonal.

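Extending the same Monte Carlo check (Gaussian X assumed, as before):

    import numpy as np

    rng = np.random.default_rng(5)
    x = rng.normal(3.0, np.sqrt(2.0), size=1_000_000)
    y = -6.0 * x + 22.0

    # Sample covariance: Cov(X, -6X + 22) = -6 Var(X) = -12
    print(np.mean((x - x.mean()) * (y - y.mean())))  # ~ -12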


3: Joint Characteristic Functions
• The joint characteristic function of two random variables X and Y is defined by

$$\Phi_{X,Y}(\omega_1, \omega_2) = E[e^{j\omega_1 X + j\omega_2 Y}] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{j\omega_1 x + j\omega_2 y}\, f_{X,Y}(x,y)\, dx\, dy$$

• The joint moments can be found from it as follows:

$$m_{nk} = (-j)^{n+k} \left. \frac{\partial^{n+k}\, \Phi_{X,Y}(\omega_1, \omega_2)}{\partial \omega_1^n\, \partial \omega_2^k} \right|_{\omega_1 = 0,\, \omega_2 = 0}$$

• The functions

$$\Phi_X(\omega_1) = \Phi_{X,Y}(\omega_1, 0) \quad \text{and} \quad \Phi_Y(\omega_2) = \Phi_{X,Y}(0, \omega_2)$$

are the marginal characteristic functions.
Example:
Two random variables X and Y have the joint characteristic function

$$\Phi_{X,Y}(\omega_1, \omega_2) = e^{-2\omega_1^2 - 8\omega_2^2}$$

Find $\bar{X}$, $\bar{Y}$, $R_{XY}$ and $C_{XY}$.

Solution:

$$\bar{X} = m_{10} = (-j) \left. \frac{\partial \Phi_{X,Y}(\omega_1, \omega_2)}{\partial \omega_1} \right|_{\omega_1=0,\,\omega_2=0} = (-j)(-4\omega_1)\, e^{-2\omega_1^2 - 8\omega_2^2} \Big|_{\omega_1=0,\,\omega_2=0} = 0$$

$$\bar{Y} = m_{01} = (-j) \left. \frac{\partial \Phi_{X,Y}(\omega_1, \omega_2)}{\partial \omega_2} \right|_{\omega_1=0,\,\omega_2=0} = (-j)(-16\omega_2)\, e^{-2\omega_1^2 - 8\omega_2^2} \Big|_{\omega_1=0,\,\omega_2=0} = 0$$

$$R_{XY} = m_{11} = (-j)^2 \left. \frac{\partial^2 \Phi_{X,Y}(\omega_1, \omega_2)}{\partial \omega_1\, \partial \omega_2} \right|_{\omega_1=0,\,\omega_2=0} = -(-4\omega_1)(-16\omega_2)\, e^{-2\omega_1^2 - 8\omega_2^2} \Big|_{\omega_1=0,\,\omega_2=0} = 0$$

$$C_{XY} = R_{XY} - \bar{X}\bar{Y} = 0 \;\Rightarrow\; X \text{ and } Y \text{ are uncorrelated.}$$


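The partial derivatives can be cross-checked symbolically; a minimal sketch with sympy, applying the moment formula from the previous slide:

    import sympy as sp

    w1, w2 = sp.symbols('w1 w2', real=True)
    phi = sp.exp(-2 * w1**2 - 8 * w2**2)  # the given joint characteristic function

    def m(n, k):
        # m_nk = (-j)^(n+k) * d^(n+k) Phi / (dw1^n dw2^k), evaluated at w1 = w2 = 0
        d = phi
        for _ in range(n):
            d = sp.diff(d, w1)
        for _ in range(k):
            d = sp.diff(d, w2)
        return sp.simplify((-sp.I) ** (n + k) * d.subs({w1: 0, w2: 0}))

    print(m(1, 0), m(0, 1), m(1, 1))  # all zero: the means and R_XY vanish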
