Functions of Random Variables
[Figure: a monotonic function Y = g(X); an interval dx on the X-axis maps to an interval dy on the Y-axis.]

PDF of X = fX(x), PDF of Y = fY(y).

Probability in variable X is mapped to variable Y. Hence,

    fY(y) dy = fX(x) dx

    fY(y) = fX(x) (dx/dy),  where x = g⁻¹(y) (inverse function)

For a monotonically decreasing function, dx/dy is negative.
[Figure: a monotonically decreasing Y = g(X); the probability fY(y) dy in the interval dy corresponds to the probability fX(x) dx in the interval dx.]

Probability in variable X is mapped to variable Y. But probabilities must be positive, hence

    |fY(y) dy| = |fX(x) dx|

    fY(y) = fX(x) |dx/dy|,  where x = g⁻¹(y)
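The change-of-variables rule above can be checked numerically. The sketch below (not from the notes) uses X ~ Uniform(0, 1) and the monotonic map Y = X², for which g⁻¹(y) = √y and |dx/dy| = 1/(2√y), so fY(y) = 1/(2√y) and the CDF is FY(y) = √y:

```python
import random

random.seed(0)

# Monte Carlo check of fY(y) = fX(g^{-1}(y)) |dx/dy| for Y = g(X) = X^2
# with X ~ Uniform(0, 1).  The derived density fY(y) = 1/(2*sqrt(y))
# integrates to the CDF FY(y) = sqrt(y), compared here with the
# empirical CDF of simulated Y values.
n = 200_000
ys = [random.random() ** 2 for _ in range(n)]

for y in (0.04, 0.25, 0.64):
    empirical = sum(v <= y for v in ys) / n
    exact = y ** 0.5          # FY(y) = sqrt(y) from the derived density
    print(f"y={y}: empirical {empirical:.3f}  vs exact {exact:.3f}")
```

The empirical and exact CDF values agree to within Monte Carlo error.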
Example 4.1

Given Y = ln X. If fX(x) is LN(λ, ζ), find fY(y).

Y = ln X  =>  x = e^y  =>  dx/dy = e^y

Recall, for LN(λ, ζ):

    fX(x) = [1/(x ζ √(2π))] exp{ −(1/2)[(ln x − λ)/ζ]² },  0 ≤ x < ∞

Then

    fY(y) = fX(x) |dx/dy|
          = [1/(e^y ζ √(2π))] exp{ −(1/2)[(ln e^y − λ)/ζ]² } · e^y
          = [1/(ζ √(2π))] exp{ −(1/2)[(y − λ)/ζ]² }  ~  N(λ, ζ)
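Example 4.1 can be verified by simulation. The sketch below uses illustrative parameters (λ = 2.0, ζ = 0.5, not from the notes) and checks that the log of lognormal samples has the expected normal mean and standard deviation:

```python
import math
import random

random.seed(1)

# Check of Example 4.1: if X ~ LN(lam, zeta), then Y = ln X ~ N(lam, zeta).
# Python's lognormvariate(mu, sigma) takes the mean and stdev of ln X,
# which match lam and zeta here.  Parameters are illustrative.
lam, zeta = 2.0, 0.5
ys = [math.log(random.lognormvariate(lam, zeta)) for _ in range(100_000)]

mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)
print(f"sample mean of ln X  = {mean:.3f}  (expect {lam})")
print(f"sample stdev of ln X = {math.sqrt(var):.3f}  (expect {zeta})")
```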
Example

X = no. of functional bulldozers, Y = X². For example, pY(4) = pX(2) = 0.384.

    X    Y = X²    P(X = xi)
    3      9        0.512
    2      4        0.384
    1      1        0.096
    0      0        0.008
Therefore, if Y = g(X) and X = g⁻¹(y) = x₁, x₂, x₃, …, xₙ:

    pY(y) = Σ_{i=1}^{n} pX(xᵢ)             for discrete r.v.

    fY(y) = Σ_{i=1}^{n} fX(xᵢ) |dxᵢ/dy|    for continuous r.v.
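The discrete rule is just a sum of pX over all roots of y = g(x). Applied to the bulldozer table above (Y = X²):

```python
# Discrete change of variables: pY(y) = sum of pX(x) over all roots
# x = g^{-1}(y).  Here g(x) = x^2 and p_x is the bulldozer PMF above.
p_x = {0: 0.008, 1: 0.096, 2: 0.384, 3: 0.512}

p_y = {}
for x, p in p_x.items():
    y = x ** 2
    p_y[y] = p_y.get(y, 0.0) + p   # accumulate over every root of y = x^2

print(p_y)   # {0: 0.008, 1: 0.096, 4: 0.384, 9: 0.512}
```

Since x ≥ 0 here, each y has exactly one root, so the PMF values carry over unchanged (pY(4) = pX(2) = 0.384).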
Two-valued function (non-monotonic)

[Figure: a non-monotonic Y = g(X); the probability fY(y) dy in the interval dy maps to two intervals dx₁ and dx₂ on the X-axis, with probabilities fX(x₁) dx₁ and fX(x₂) dx₂.]

The probability fY(y) dy is mapped to two regions, fX(x₁) dx₁ and fX(x₂) dx₂. Hence,

    |fY(y) dy| = |fX(x₁) dx₁| + |fX(x₂) dx₂|
Multi-valued function (non-monotonic)

(Don’t worry, functions with 3 or more roots are not tested.)

[Figure: a non-monotonic Y = g(X) with three roots; the probability fY(y) dy in the interval dy maps to three intervals dx₁, dx₂ and dx₃, with probabilities fX(x₁) dx₁, fX(x₂) dx₂ and fX(x₃) dx₃.]

    fY(y) = Σ_{i=1}^{n} fX(xᵢ) |dxᵢ/dy|    for multi-valued functions and continuous r.v.
Example (Y = X² with X ~ N(0, 1)):

a) x₁ = +√y  =>  dx₁/dy = 1/(2√y)

    contribution: [1/√(2π)] exp(−x₁²/2) · 1/(2√y) = [1/(2√(2πy))] exp(−y/2)

b) x₂ = −√y  =>  dx₂/dy = −1/(2√y)

    contribution: [1/√(2π)] exp(−x₂²/2) · |−1/(2√y)| = [1/(2√(2πy))] exp(−y/2)

Summing the two branches:

    fY(y) = [1/√(2πy)] exp(−y/2),  y > 0
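The two-branch result can be checked by simulation. For X ~ N(0, 1) and Y = X², summing the branches gives fY(y) = exp(−y/2)/√(2πy), whose CDF is P(Y ≤ y) = P(|X| ≤ √y) = erf(√(y/2)):

```python
import math
import random

random.seed(2)

# Check of the two-root example: for X ~ N(0,1) and Y = X^2, the CDF
# implied by fY(y) = exp(-y/2)/sqrt(2*pi*y) is FY(y) = erf(sqrt(y/2)).
# Compare it with the empirical CDF of simulated X^2 values.
n = 200_000
ys = [random.gauss(0.0, 1.0) ** 2 for _ in range(n)]

for y in (0.5, 1.0, 2.0):
    empirical = sum(v <= y for v in ys) / n
    exact = math.erf(math.sqrt(y / 2.0))
    print(f"y={y}: empirical {empirical:.3f}  vs exact {exact:.3f}")
```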
Correlation coefficient

    −1 ≤ ρ ≤ 1,  r = sample correlation coefficient

[Figure: scatter plots illustrating negative correlation (points along a downward straight line), zero correlation (uncorrelated scatter), and positive correlation (points along an upward straight line).]
Correlation vs Dependence (background, not tested)

X and Y are independent if

    P(XY) = P(X) P(Y)           (discrete)

    fXY(x, y) = fX(x) fY(y)     (joint probability density function, continuous r.v.)
X₁ ~ N(μX1, σX1), X₂ ~ N(μX2, σX2) with correlation ρX1X2:

    cov(X₁, X₂) = E[(X₁ − μX1)(X₂ − μX2)]

    ρX1X2 = cov(X₁, X₂) / (σX1 σX2)
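The sample versions of these definitions are straightforward to compute. A minimal sketch with made-up data (illustrative only): r = cov(x, y)/(sx · sy):

```python
import math

# Sample covariance and sample correlation coefficient r for two small
# data sets (numbers are made up for illustration).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / (n - 1))
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / (n - 1))

r = cov / (sx * sy)
print(f"cov = {cov:.3f}, r = {r:.4f}")   # r near +1: strong positive correlation
```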
Relationship is coincidental
Observation: Russian state leaders alternate from bald to non-bald for 200
years
Actual explanation: Purely coincidental
Littlewood's law states that a person can expect to experience events with odds of one in a million (i.e. a miracle) at the rate of about one per month.
https://en.wikipedia.org/wiki/Littlewood%27s_law
Example 4.3 – Combined load on column (dead + live)

    Var(Y) = Σᵢ Σⱼ aᵢ aⱼ ρXiXj σXi σXj

For three variables this expands to:

    a₁² σ²X1            + a₁a₂ ρX1X2 σX1 σX2 + a₁a₃ ρX1X3 σX1 σX3
    + a₂a₁ ρX2X1 σX2 σX1 + a₂² σ²X2           + a₂a₃ ρX2X3 σX2 σX3
    + a₃a₁ ρX3X1 σX3 σX1 + a₃a₂ ρX3X2 σX3 σX2 + a₃² σ²X3

Note that ρX1X1 = ρX2X2 = ρX3X3 = 1 and ρX1X2 = ρX2X1, etc.
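The double sum maps directly to code. The sketch below uses made-up coefficients, standard deviations and correlations (not the values of Example 4.3):

```python
import math

# Var(Y) = sum_i sum_j a_i a_j rho_ij sigma_i sigma_j
# for a linear combination Y = a1*X1 + a2*X2 + a3*X3.
# All numbers below are illustrative, not from the example.
a = [1.0, 0.5, 0.5]
sigma = [2.0, 1.0, 1.5]
rho = [[1.0, 0.3, 0.0],      # rho_ii = 1 on the diagonal,
       [0.3, 1.0, 0.2],      # symmetric off-diagonal entries
       [0.0, 0.2, 1.0]]

var_y = sum(a[i] * a[j] * rho[i][j] * sigma[i] * sigma[j]
            for i in range(3) for j in range(3))
print(f"Var(Y) = {var_y:.4f}, sigma_Y = {math.sqrt(var_y):.4f}")
```

With zero correlations the sum reduces to Σ aᵢ² σᵢ², the familiar independent case.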
X₁ ~ LN(λX1, ζX1), X₂ ~ LN(λX2, ζX2) with correlation ρ'X1X2

    Y = a₀ X₁^a1 X₂^a2

Take log:

    ln Y = ln a₀ + a₁ ln X₁ + a₂ ln X₂

with ln X₁ ~ N(λX1, ζX1) and ln X₂ ~ N(λX2, ζX2) (the log of a lognormal variate is normal).

Calculate: E[ln Y] = λY and Stdev(ln Y) = ζY.
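A minimal sketch of the calculation, with illustrative parameters and assuming ρ' is the correlation between ln X₁ and ln X₂ (so the cross term enters the variance of the sum):

```python
import math

# lam_Y = E[ln Y] and zeta_Y = Stdev(ln Y) for Y = a0 * X1^a1 * X2^a2,
# assuming rho is the correlation of ln X1 and ln X2.
# All parameter values below are illustrative.
a0, a1, a2 = 2.0, 1.0, -1.0
lam1, zeta1 = 0.5, 0.2
lam2, zeta2 = 1.0, 0.3
rho = 0.4

lam_y = math.log(a0) + a1 * lam1 + a2 * lam2          # E[ln Y]
var_ln_y = ((a1 * zeta1) ** 2 + (a2 * zeta2) ** 2
            + 2 * a1 * a2 * rho * zeta1 * zeta2)       # Var(ln Y) with cross term
zeta_y = math.sqrt(var_ln_y)                           # Stdev(ln Y)
print(f"lam_Y = {lam_y:.4f}, zeta_Y = {zeta_y:.4f}")
```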
Example 4.4 – Settlement of footing, S

Settlement of footing on sand: S = PBI/M

    P (applied pressure):              LN, λ = −0.005, ζ = 0.1
    B (footing dimension):             LN, λ = 1.792,  ζ = 0
    I (influencing factor):            LN, λ = −0.516, ζ = 0.1
    M (modulus of compressibility):    LN, λ = 3.455,  ζ = 0.15

Assume P, B, I, and M are independent LN variates; find (a) the mean settlement, (b) P(S < 0.2).

Product of lognormals, hence S will be LN(λS, ζS), where

    λS = λP + λB + λI − λM = −2.184

    ζS² = ζP² + ζB² + ζI² + ζM² = 0.0425,  ζS = 0.206

(a) mean settlement:  μS = exp(λS + 0.5 ζS²) = 0.115

(b) P(S < 0.2) = Φ[ (ln 0.2 − (−2.184)) / 0.206 ] = Φ(2.789) = 0.9974
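The arithmetic of Example 4.4 can be reproduced in a few lines, using math.erf for the standard normal CDF Φ:

```python
import math

# Reproduce Example 4.4: S = P*B*I/M with independent lognormal variates.
def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

lam = {"P": -0.005, "B": 1.792, "I": -0.516, "M": 3.455}
zeta = {"P": 0.1, "B": 0.0, "I": 0.1, "M": 0.15}

lam_s = lam["P"] + lam["B"] + lam["I"] - lam["M"]        # lam_S = -2.184
zeta_s = math.sqrt(sum(z ** 2 for z in zeta.values()))   # zeta_S = sqrt(0.0425)

mean_s = math.exp(lam_s + 0.5 * zeta_s ** 2)             # (a) mean settlement
p = phi((math.log(0.2) - lam_s) / zeta_s)                # (b) P(S < 0.2)
print(f"lam_S = {lam_s:.3f}, zeta_S = {zeta_s:.3f}")
print(f"mean settlement = {mean_s:.3f}, P(S < 0.2) = {p:.4f}")
```

Note that M enters the quotient with a minus sign in λS but its ζ² still adds (variances of the log terms add regardless of sign for independent variates).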