Recap: Random Variables
Random variables
1 lecture 3
Mean / Expected value
Definition
Definition:
Let X be a random variable with probability / density
function f(x). The mean or expected value of X
is given by

\mu = E(X) = \sum_{x} x f(x)

if X is discrete, and

\mu = E(X) = \int_{-\infty}^{\infty} x f(x)\,dx

if X is continuous.
Mean / Expected value
Interpretation
Interpretation:
The total contribution of a value multiplied by the probability
of the value – a weighted average.
Example: mean value = 1.5
[Figure: bar chart of a probability function f(x) for x = 0, 1, 2, 3, with mean 1.5]
Mean / Expected value
Example
Problem:
• A private pilot wishes to insure his plane valued at 1 mill kr.
• The insurance company expects a loss with the following
probabilities:
• Total loss with probability 0.001
• 50% loss with probability 0.01
• 25% loss with probability 0.1
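A fair premium (before administration and profit) equals the expected loss, computed directly from the definition E(X) = Σ x f(x). A minimal sketch in Python (the no-loss outcome carries the remaining probability mass and contributes nothing to the sum):

```python
# Expected loss for a plane valued at 1 mill kr.
value = 1_000_000
loss_dist = [(1.00, 0.001),   # total loss
             (0.50, 0.01),    # 50% loss
             (0.25, 0.1)]     # 25% loss

# E(X) = sum of loss * probability over the possible losses
expected_loss = sum(frac * value * p for frac, p in loss_dist)
print(expected_loss)  # 31000.0
```

So the insurance company must charge at least 31,000 kr just to cover the expected loss.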
Theorem:
Let X be a random variable with probability / density
function f(x). The expected value of g(X) is
\mu_{g(X)} = E[g(X)] = \sum_{x} g(x) f(x)

if X is discrete, and

\mu_{g(X)} = E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx

if X is continuous.
Expected value
Linear combination
E(aX+b) = aE(X)+b
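This rule is easy to verify numerically against the definition. A small sketch (the pmf below is a hypothetical example, not from the slides):

```python
# Numerical check of E(aX + b) = a*E(X) + b.
xs = [0, 1, 2, 3]
fs = [0.1, 0.4, 0.3, 0.2]  # hypothetical pmf, sums to 1

a, b = 2.0, 5.0
E_X = sum(x * f for x, f in zip(xs, fs))
E_aXb = sum((a * x + b) * f for x, f in zip(xs, fs))  # direct definition
print(E_aXb, a * E_X + b)  # both sides agree
```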
Mean / Expected value
Example
Problem:
• The pilot from before buys a new plane valued at 2 mill kr.
• The insurance company expects losses with the same probabilities as before:
• Total loss with probability 0.001
• 50% loss with probability 0.01
• 25% loss with probability 0.1
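The loss on the new plane is 2X, where X is the loss on the old plane, so by the linearity rule E(2X) = 2E(X) the expected loss simply doubles. A short numerical check (probabilities from the slide):

```python
# Expected loss doubles because the loss on the 2 mill kr plane is 2X,
# where X is the loss on the 1 mill kr plane: E(2X) = 2*E(X).
old_value = 1_000_000
loss_dist = [(1.00, 0.001), (0.50, 0.01), (0.25, 0.1)]  # from the slide

e_old = sum(frac * old_value * p for frac, p in loss_dist)  # E(X)
e_new = 2 * e_old                                           # E(2X)
print(e_old, e_new)  # 31000.0 62000.0
```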
Mean / Expected value
Function of two random variables
Definition:
Let X and Y be random variables with joint probability /
density function f(x,y). The expected value of g(X,Y) is

\mu_{g(X,Y)} = E[g(X,Y)] = \sum_{x} \sum_{y} g(x,y) f(x,y)

if X and Y are discrete, and

\mu_{g(X,Y)} = E[g(X,Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x,y) f(x,y)\,dx\,dy

if X and Y are continuous.
Mean / Expected value
Function of two random variables
Problem:
Burger King sells both via “drive-in” and “walk-in”.
Let X and Y be the fractions of the opening hours that “drive-in” and “walk-
in” are busy.
Assume that the joint density of X and Y is given by

f(x,y) = \begin{cases} 4xy & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{otherwise} \end{cases}
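For instance, the expected product E[XY] = ∫∫ xy · 4xy dx dy over the unit square equals 4/9 analytically, and can be checked numerically. A midpoint Riemann-sum sketch (the grid size n is an arbitrary choice):

```python
# Approximate E[XY] for the joint density f(x,y) = 4xy on the unit square
# with a midpoint Riemann sum; analytically E[XY] = 4/9.
n = 400
h = 1.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        total += (x * y) * (4 * x * y) * h * h  # g(x,y) * f(x,y) * dA
print(total)  # close to 4/9 ≈ 0.4444
```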
Mean / Expected value
Sums and products
Theorem: Sum/Product
Let X and Y be random variables. Then

E[X + Y] = E[X] + E[Y]

and, if X and Y are independent,

E[XY] = E[X] E[Y]
Variance
Definition
Definition:
Let X be a random variable with probability / density
function f(x) and expected value \mu. The variance of X is
then given by

\sigma^2 = Var(X) = E[(X - \mu)^2] = \sum_{x} (x - \mu)^2 f(x)

if X is discrete, and

\sigma^2 = Var(X) = E[(X - \mu)^2] = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx

if X is continuous.
[Figure: two probability functions f(x) with the same mean; left: variance = 0.5, right: variance = 2]
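Computing a variance from the definition takes two passes over the pmf: first the mean, then the expected squared deviation. A sketch with a hypothetical pmf (values illustrative, not from the figure):

```python
# Variance from the definition: Var(X) = sum of (x - mu)^2 * f(x).
xs = [0, 1, 2, 3]
fs = [0.1, 0.4, 0.3, 0.2]  # hypothetical pmf

mu = sum(x * f for x, f in zip(xs, fs))
var = sum((x - mu) ** 2 * f for x, f in zip(xs, fs))
print(mu, var)  # 1.6 and 0.84, up to rounding
```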
Variance
Linear combinations
Examples:
Var(X + 7) = Var(X)
Var(-X) = Var(X)
Var(2X) = 4 Var(X)
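These examples are all instances of the general rule Var(aX + b) = a² Var(X): shifting by a constant changes nothing, while scaling by a multiplies the variance by a². A numerical check with a hypothetical pmf (not from the slides):

```python
# Numerical check of Var(aX + b) = a^2 * Var(X).
xs = [0, 1, 2, 3]
fs = [0.1, 0.4, 0.3, 0.2]  # hypothetical pmf

def var(values, probs):
    mu = sum(v * p for v, p in zip(values, probs))
    return sum((v - mu) ** 2 * p for v, p in zip(values, probs))

a, b = 2.0, 7.0
lhs = var([a * x + b for x in xs], fs)  # Var(aX + b) from the definition
rhs = a ** 2 * var(xs, fs)              # a^2 * Var(X)
print(lhs, rhs)  # both sides agree
```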
Covariance
Definition
Definition:
Let X and Y be two random variables with joint probability /
density function f(x,y). The covariance between X and Y is

\sigma_{XY} = Cov(X,Y) = E[(X - \mu_X)(Y - \mu_Y)] = \sum_{x} \sum_{y} (x - \mu_X)(y - \mu_Y) f(x,y)

if X and Y are discrete, and

\sigma_{XY} = Cov(X,Y) = E[(X - \mu_X)(Y - \mu_Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y) f(x,y)\,dx\,dy

if X and Y are continuous.
Covariance
Interpretation
A positive covariance means that large values of X tend to occur together with large values of Y; a negative covariance means that large values of X tend to occur together with small values of Y. A covariance near zero indicates no linear relationship.
Covariance
Properties
Theorem:
The covariance between two random variables X and Y
with means µX and µY, respectively, is
\sigma_{XY} = Cov(X,Y) = E[XY] - \mu_X \mu_Y
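This shortcut is usually easier than the definition. Applied to the joint density f(x,y) = 4xy from the Burger King example, it gives covariance 0: the density factorizes as (2x)(2y), so X and Y are independent. A numerical sketch using midpoint sums (the grid size n is an arbitrary choice):

```python
# Cov(X,Y) = E[XY] - E[X]*E[Y] for f(x,y) = 4xy on the unit square,
# approximated with midpoint Riemann sums.
n = 400
h = 1.0 / n
e_xy = e_x = e_y = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        w = 4 * x * y * h * h  # f(x,y) * dA
        e_xy += x * y * w
        e_x += x * w
        e_y += y * w
cov = e_xy - e_x * e_y
print(cov)  # close to 0: f factorizes, so X and Y are independent
```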
Variance/Covariance
Correlation
For the correlation coefficient \rho_{XY} = \sigma_{XY} / (\sigma_X \sigma_Y) it holds that -1 \le \rho_{XY} \le 1.
Mean, variance, covariance
Collection of rules