Notes On Mathematical Expectation
Sonuc Zorlu, Lecture Notes
Definition. Let X be a random variable with probability distribution $f(x)$. The mean, or expected value, of X is
$\mu = \mu_X = E(X) = \sum_{\text{all } x} x f(x)$ if X is discrete, and
$\mu = \mu_X = E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$ if X is continuous.
Example 1. A coin is biased so that a head is three times as likely to occur as a tail. Find the
expected number of tails when this coin is tossed twice.
$P(T) + P(H) = 1$; $P(T) + 3P(T) = 1 \Rightarrow P(T) = \tfrac{1}{4}$ and $P(H) = \tfrac{3}{4}$.
The probability distribution of X, the number of tails in the two tosses, is

x        0       1       2
f(x)    9/16    6/16    1/16

and $\mu = \mu_X = E(X) = \sum_{\text{all } x} x f(x) = 0\cdot\tfrac{9}{16} + 1\cdot\tfrac{6}{16} + 2\cdot\tfrac{1}{16} = \tfrac{8}{16} = \tfrac{1}{2}$.
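As a quick check of Example 1 (an illustrative sketch, not part of the original notes), the four outcomes of two tosses of the biased coin can be enumerated directly in Python:

```python
from itertools import product

# Biased coin of Example 1: P(H) = 3/4, P(T) = 1/4.
p = {"H": 3/4, "T": 1/4}

# Weight the number of tails in each of the four outcomes by its probability.
expected_tails = sum(
    p[a] * p[b] * (a, b).count("T")
    for a, b in product("HT", repeat=2)
)
print(expected_tails)  # 0.5, in agreement with E(X) = 1/2
```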
Example 2. If a dealer's profit, in units of $5000, on a new automobile can be looked upon as a random variable X having the density function
$f(x) = 2(1 - x)$ for $0 < x < 1$, and $f(x) = 0$ otherwise,
find the average profit per automobile.

$\mu = E(X) = \int_{0}^{1} x f(x)\,dx = \int_{0}^{1} 2x(1 - x)\,dx = \tfrac{1}{3}$. Thus, (1/3)(5000) = 1666.67, so the average profit per automobile is $1666.67.
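The integral in Example 2 can also be checked numerically; the sketch below uses a simple midpoint Riemann sum (an illustration, assuming nothing beyond the Python standard library):

```python
# Midpoint-rule approximation of E(X) = integral over (0, 1) of x * 2(1 - x) dx.
n = 100_000
dx = 1.0 / n
mean = sum((i + 0.5) * dx * 2 * (1 - (i + 0.5) * dx) * dx for i in range(n))

print(mean)         # approximately 1/3
print(5000 * mean)  # approximately 1666.67 dollars of profit per automobile
```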
Example 3. Let X be a random variable with probability distribution

x       -3     6      9
f(x)    1/6    1/2    1/3

Find $\mu_{g(X)} = E[g(X)]$, where $g(X) = (2X + 1)^2$.

$\mu_{g(X)} = E[g(X)] = \sum_{\text{all } x} g(x) f(x) = \sum_{\text{all } x} (2x + 1)^2 f(x) = 25\cdot\tfrac{1}{6} + 169\cdot\tfrac{1}{2} + 361\cdot\tfrac{1}{3} = 209$.
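A direct computational check of Example 3 (a sketch, using the distribution given in the example):

```python
from fractions import Fraction

# Distribution of Example 3: f(-3) = 1/6, f(6) = 1/2, f(9) = 1/3.
dist = {-3: Fraction(1, 6), 6: Fraction(1, 2), 9: Fraction(1, 3)}

def g(x):
    return (2 * x + 1) ** 2

expected_g = sum(g(x) * p for x, p in dist.items())
print(expected_g)  # 209
```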
In general, the mean, or expected value, of the random variable $g(X)$ is
$\mu_{g(X)} = E[g(X)] = \sum_{\text{all } x} g(x) f(x)$ if X is discrete, and
$\mu_{g(X)} = E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx$ if X is continuous.
Definition. Let X and Y be random variables with joint probability distribution $f(x, y)$. The mean, or expected value, of the random variable $g(X, Y)$ is
$\mu_{g(X,Y)} = E[g(X, Y)] = \sum_{x}\sum_{y} g(x, y) f(x, y)$ if X and Y are discrete, and
$\mu_{g(X,Y)} = E[g(X, Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y) f(x, y)\,dx\,dy$ if X and Y are continuous.
Definition. Let X be a random variable with probability distribution $f(x)$ and mean $\mu$. The variance of X is
$\mathrm{Var}(X) = \sigma_X^2 = E[(X - \mu)^2] = \sum_{\text{all } x} (x - \mu)^2 f(x)$, if X is discrete, and
$\mathrm{Var}(X) = \sigma_X^2 = E[(X - \mu)^2] = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx$, if X is continuous.
The positive square root of the variance is called the standard deviation $\sigma_X$ of X.
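To make the continuous case concrete, the sketch below applies this definition to the density of Example 2, f(x) = 2(1 - x) on (0, 1); it assumes the sympy library is available.

```python
import sympy as sp

# Mean, variance and standard deviation for the dealer-profit density of Example 2.
x = sp.symbols("x")
f = 2 * (1 - x)

mu = sp.integrate(x * f, (x, 0, 1))                 # E(X) = 1/3
var = sp.integrate((x - mu) ** 2 * f, (x, 0, 1))    # Var(X) = 1/18
sigma = sp.sqrt(var)                                # sigma_X = sqrt(2)/6

print(mu, var, sigma)
```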
Definition. Let X and Y be random variables with joint probability distribution $f(x, y)$. The covariance of X and Y is
$\mathrm{Cov}(X, Y) = \sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)]$
$= \sum_{x}\sum_{y} (x - \mu_X)(y - \mu_Y) f(x, y)$, if X and Y are discrete, and
$= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y) f(x, y)\,dx\,dy$, if X and Y are continuous.
Theorem. If X and Y are independent, then Cov( X , Y ) = 0 . The converse is not always true.
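A standard counterexample (not taken from these notes) shows why the converse fails: let X be uniform on {-1, 0, 1} and Y = X^2. Then Y is completely determined by X, yet Cov(X, Y) = 0, as the short check below confirms.

```python
from fractions import Fraction

# X uniform on {-1, 0, 1} and Y = X^2: dependent, but Cov(X, Y) = 0.
xs = [-1, 0, 1]
p = Fraction(1, 3)

e_x = sum(x * p for x in xs)            # E(X) = 0
e_y = sum(x ** 2 * p for x in xs)       # E(Y) = 2/3
e_xy = sum(x * x ** 2 * p for x in xs)  # E(XY) = E(X^3) = 0

print(e_xy - e_x * e_y)  # 0, yet P(Y = 1 | X = 1) = 1 differs from P(Y = 1) = 2/3
```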
Theorem. The covariance of two random variables X and Y with means $\mu_X$ and $\mu_Y$, respectively, is
$\sigma_{XY} = E(XY) - \mu_X \mu_Y = E(XY) - E(X)E(Y)$.
Definition. Let X and Y be random variables with covariance $\sigma_{XY}$ and standard deviations $\sigma_X$ and $\sigma_Y$, respectively. The correlation coefficient of X and Y is
$\rho_{XY} = \dfrac{\sigma_{XY}}{\sigma_X \sigma_Y}$.
Example 4. Suppose that X and Y are independent random variables having the joint probability distribution below, with marginal distributions g(x) and h(y). Find Cov(X, Y).

                     x
  f(x, y)        1       2      h(y)
        1      0.10    0.15    0.25
  y     3      0.20    0.30    0.50
        5      0.10    0.15    0.25
  g(x)         0.40    0.60     1
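A mechanical check of Example 4 (a sketch using exact fractions): it computes E(X), E(Y), and E(XY) from the joint table and confirms that Cov(X, Y) = 0, as the theorem for independent variables predicts.

```python
from fractions import Fraction as F

# Joint distribution of Example 4, keyed by (x, y).
f = {(1, 1): F(10, 100), (2, 1): F(15, 100),
     (1, 3): F(20, 100), (2, 3): F(30, 100),
     (1, 5): F(10, 100), (2, 5): F(15, 100)}

e_x = sum(x * p for (x, y), p in f.items())       # 8/5
e_y = sum(y * p for (x, y), p in f.items())       # 3
e_xy = sum(x * y * p for (x, y), p in f.items())  # 24/5

print(e_xy - e_x * e_y)  # 0 = Cov(X, Y)
```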
Example 5. Find the covariance of the random variables X and Y having the joint probability
density
$f(x, y) = \begin{cases} x + y, & 0 < x < 1,\ 0 < y < 1 \\ 0, & \text{otherwise.} \end{cases}$
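One way to work Example 5 is to compute E(X), E(Y), and E(XY) by double integration and apply the theorem $\sigma_{XY} = E(XY) - \mu_X \mu_Y$; the sketch below does this symbolically, assuming sympy is available.

```python
import sympy as sp

# Covariance of X and Y for f(x, y) = x + y on the unit square.
x, y = sp.symbols("x y")
f = x + y

e_x = sp.integrate(x * f, (x, 0, 1), (y, 0, 1))       # E(X) = 7/12
e_y = sp.integrate(y * f, (x, 0, 1), (y, 0, 1))       # E(Y) = 7/12
e_xy = sp.integrate(x * y * f, (x, 0, 1), (y, 0, 1))  # E(XY) = 1/3

print(e_xy - e_x * e_y)  # -1/144
```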
Example. If X and Y are random variables with variances $\sigma_X^2 = 2$ and $\sigma_Y^2 = 4$ and covariance $\sigma_{XY} = -2$, find the variance of the random variable $Z = 3X - 4Y + 8$.

Using $\mathrm{Var}(aX + bY + c) = a^2\,\mathrm{Var}(X) + b^2\,\mathrm{Var}(Y) + 2ab\,\mathrm{Cov}(X, Y)$,

$\mathrm{Var}(Z) = \mathrm{Var}(3X - 4Y + 8) = 9\,\mathrm{Var}(X) + 16\,\mathrm{Var}(Y) - 2(3)(4)\,\mathrm{Cov}(X, Y) = 9\sigma_X^2 + 16\sigma_Y^2 - 24\sigma_{XY} = 9(2) + 16(4) - 24(-2) = 130$.
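As an optional numerical illustration (assuming numpy is available, and using a bivariate normal purely as a convenient distribution with the stated variances and covariance), a Monte Carlo sample reproduces Var(Z) close to 130:

```python
import numpy as np

# Sample (X, Y) with Var(X) = 2, Var(Y) = 4, Cov(X, Y) = -2.
rng = np.random.default_rng(0)
cov = [[2.0, -2.0], [-2.0, 4.0]]
xy = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=1_000_000)

z = 3 * xy[:, 0] - 4 * xy[:, 1] + 8
print(z.var())  # close to the exact value 130
```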
Exercises
Exercise 1. Let X denote the number of times a certain numerical control machine will
malfunction (1, 2, or 3 times) on a given day. Let Y denote the number of times a technician is
called on an emergency call. Their joint probability distribution is given as

                     x
  f(x, y)        1       2       3
        1      0.05    0.05    0.10
  y     2      0.05    0.10    0.35
        3      0       0.20    0.10
Determine the covariance between X and Y .
Exercise 2. Let X denote the number of heads and Y the number of heads minus the number of
tails when 3 coins are tossed.
(a) Find the joint probability distribution of X and Y .
(b) Find the marginal distributions of X and Y .
(c) Find E(X), E(Y), and E(XY).
(d) Determine whether X and Y are independent or not.
Exercise 3. Find the correlation coefficient between X and Y having the joint density function
$f(x, y) = \begin{cases} x + y, & 0 < x < 1,\ 0 < y < 1 \\ 0, & \text{elsewhere.} \end{cases}$
Exercise 4. Suppose that X and Y are independent random variables with probability densities
$g(x) = \begin{cases} \dfrac{8}{x^3}, & x > 2 \\ 0, & \text{elsewhere} \end{cases}$
and
$h(y) = \begin{cases} 2y, & 0 < y < 1 \\ 0, & \text{elsewhere.} \end{cases}$
Find E(XY).