POC Unit-1 Final


Outline

 Definition of a Random Variable


 Discrete Random Variables
 Continuous Random Variables
 Expectations, Variances
 Exponential Distributions
 Joint Probability Distributions
 Marginal Probability Distributions
 Covariance

1
Definition of a Random Variable
 A random variable is a real-valued function defined on a sample space S. In a particular experiment, a random variable X is a function that assigns a real number X(s) to each possible outcome s ∈ S.

 A discrete random variable can take a countable number of values.
 Example: the number of steps to the top of the Eiffel Tower.
 A continuous random variable can take any value along a given interval.
 Example: the time a tourist stays at the top once s/he gets there.

2
Probability Distributions, Mean and Variance for Discrete
Random Variables

 The probability distribution of a discrete random variable is defined as a function that specifies the probability associated with each possible value the random variable can assume.
 p(x) ≥ 0 for all values of x
 \sum_x p(x) = 1
 The mean, or expected value, of a discrete random variable is

\mu = E(X) = \sum_x x\,p(x).

 The variance of a discrete random variable X is

\sigma^2 = E[(X - \mu)^2] = \sum_x (x - \mu)^2\,p(x).
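As a quick check of these definitions, a short Python sketch computes the mean and variance directly from a probability table (the fair six-sided die distribution is an illustrative assumption, not from the slides):

```python
# Mean and variance of a discrete random variable from its probability table.
# The distribution below (a fair six-sided die) is an assumed example.
p = {x: 1/6 for x in range(1, 7)}

assert all(px >= 0 for px in p.values())
assert abs(sum(p.values()) - 1) < 1e-12   # probabilities sum to 1

mu = sum(x * px for x, px in p.items())             # E(X) = sum of x p(x)
var = sum((x - mu)**2 * px for x, px in p.items())  # E[(X - mu)^2]

print(round(mu, 4), round(var, 4))  # 3.5 2.9167
```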

3
The Binomial Distribution

 A Binomial Random Variable (running example: flip a coin 3 times)
 n identical trials (the 3 flips)
 Two outcomes: Success or Failure (Heads or Tails)
 P(S) = p; P(F) = q = 1 − p (P(H) = 0.5; P(T) = 1 − 0.5 = 0.5)
 Trials are independent (a head on flip i doesn’t change P(H) of flip i + 1)
 x is the number of S’s in n trials

4
The Binomial Distribution Probability Distribution
The probability of x successes in n trials:

P(x) = \binom{n}{x} p^x q^{n-x},

where \binom{n}{x} is the number of ways of getting the desired results, p^x is the probability of getting the required number of successes, and q^{n-x} is the probability of getting the required number of failures.
 Example: Binomial tree model in option pricing.
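A minimal Python sketch of this formula (the function name binomial_pmf is my own):

```python
from math import comb

def binomial_pmf(x: int, n: int, p: float) -> float:
    """P(X = x) for a Binomial(n, p) random variable: C(n, x) p^x q^(n-x)."""
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

# Three fair coin flips: P(exactly 2 heads) = 3 * 0.5^3
print(binomial_pmf(2, n=3, p=0.5))  # 0.375
```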

5
Mean and Variance of Binomial Distribution

mean \mu = np, variance \sigma^2 = npq

E(X) = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k} = np \sum_{s=0}^{m} \frac{m!}{s!(m-s)!} p^s (1-p)^{m-s} = np,

where m = n − 1 and s = k − 1.

Var(X) = E(X^2) - (E(X))^2,

E(X^2) = \sum_{k=0}^{n} k^2 \binom{n}{k} p^k (1-p)^{n-k} = np \sum_{s=0}^{m} (s+1) \frac{m!}{s!(m-s)!} p^s (1-p)^{m-s} = np(np - p + 1),

so Var(X) = np(np - p + 1) - (np)^2 = np(1 - p).
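These closed forms can be checked numerically; the sketch below sums k·P(k) and k²·P(k) over a Binomial(10, 0.3) distribution (the parameters are arbitrary choices for illustration):

```python
from math import comb

n, p = 10, 0.3  # arbitrary example parameters
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean = sum(k * pk for k, pk in enumerate(pmf))   # E(X)
ex2 = sum(k**2 * pk for k, pk in enumerate(pmf)) # E(X^2)
var = ex2 - mean**2                              # Var(X) = E(X^2) - (E(X))^2

print(round(mean, 4), round(var, 4))  # 3.0 2.1, matching np and np(1 - p)
```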

6
The Binomial Distribution Probability Distribution
 Example 2: Say 40% of the class is female.
What is the probability that 6 of the first 10 students
walking in will be female?

P(x) = \binom{n}{x} p^x q^{n-x}

P(6) = \binom{10}{6} (0.4)^6 (0.6)^{10-6}
     = 210 × 0.004096 × 0.1296
     = 0.1115
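Checking Example 2 in Python, as a direct transcription of the computation above:

```python
from math import comb

# P(6 of the first 10 students are female) when P(female) = 0.4
p_6 = comb(10, 6) * 0.4**6 * 0.6**4
print(round(p_6, 4))  # 0.1115
```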

7
The Poisson Distribution
 Evaluates the probability of a (usually small) number of occurrences out of many opportunities in a period of time, area, volume, weight, distance or other unit of measurement

P(x) = \frac{\lambda^x e^{-\lambda}}{x!}

 λ = mean number of occurrences in the given unit of time, area, volume, etc.
 Mean: μ = λ; variance: σ² = λ

E(X) = \sum_{x=0}^{\infty} x \frac{\lambda^x e^{-\lambda}}{x!} = \lambda \sum_{x=1}^{\infty} \frac{\lambda^{x-1} e^{-\lambda}}{(x-1)!} = \lambda,

Var(X) = \lambda.

8
The Poisson Distribution (Example 3)
 Example 3: Say, in a given stream there are an average of 3 striped trout per 100 yards. What is the probability of seeing 5 striped trout in the next 100 yards, assuming a Poisson distribution?

P(x = 5) = \frac{\lambda^x e^{-\lambda}}{x!} = \frac{3^5 e^{-3}}{5!} = 0.1008
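Example 3 as a short Python check, transcribing the Poisson formula directly:

```python
from math import exp, factorial

lam = 3  # mean number of trout per 100 yards
x = 5
p = lam**x * exp(-lam) / factorial(x)  # Poisson pmf at x = 5
print(round(p, 4))  # 0.1008
```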

9
Continuous Probability Distributions

 A continuous random variable can take any numerical value within some interval.
 A continuous distribution can be characterized by its probability density function. For example, for an interval (a, b],

P(a < X \le b) = \int_a^b f_X(x)\,dx.

• The function f_X(x) is called the probability density function of X. Every p.d.f. f_X(x) must satisfy

f_X(x) \ge 0 for all x, and \int_{-\infty}^{\infty} f_X(x)\,dx = 1.

10
Continuous Probability Distributions
 There are an infinite number of possible outcomes, so P(X = x) = 0 for any single value x.
 Instead, find P(a < X ≤ b).

 If a random variable X has a continuous distribution for which the p.d.f. is f_X(x), then the expectation E(X) and variance Var(X) are defined as follows:

\mu = E(X) = \int_{-\infty}^{\infty} x f_X(x)\,dx, \qquad Var(X) = E[(X - \mu)^2].
11
The Uniform Distribution on an Interval
f_X(x) = \begin{cases} \frac{1}{d-c} & \text{for } c \le x \le d \\ 0 & \text{otherwise} \end{cases}

 For two values a and b,

P(a \le X \le b) = \frac{b-a}{d-c}, \qquad c \le a \le b \le d.

 Mean and variance:

\mu = \int_c^d \frac{x}{d-c}\,dx = \frac{c+d}{2}, \qquad \sigma^2 = \int_c^d \left(x - \frac{c+d}{2}\right)^2 \frac{1}{d-c}\,dx = \frac{(d-c)^2}{12}.
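A small Python sketch of these uniform-distribution formulas (the interval [2, 10] is an arbitrary assumption for illustration):

```python
c, d = 2.0, 10.0  # arbitrary example interval

def uniform_prob(a: float, b: float) -> float:
    """P(a <= X <= b) for X ~ Uniform(c, d), assuming c <= a <= b <= d."""
    return (b - a) / (d - c)

mean = (c + d) / 2        # (c + d)/2
var = (d - c)**2 / 12     # (d - c)^2 / 12

print(uniform_prob(3, 7), mean, round(var, 4))  # 0.5 6.0 5.3333
```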
12
The Normal Distribution
 The probability density function f(x):

f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}

µ = the mean of x, σ = the standard deviation of x
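A direct Python transcription of this density (the function name normal_pdf is my own):

```python
from math import exp, pi, sqrt

def normal_pdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """Density of a Normal(mu, sigma^2) random variable at x."""
    return exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

print(round(normal_pdf(0.0), 4))  # 0.3989, the peak of the standard normal
```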

13
The Exponential Distribution
 Probability Distribution for an Exponential Random Variable x
 Probability Density Function

f_X(x) = \frac{1}{\theta} e^{-x/\theta}, \qquad x > 0

 Mean: μ = θ; Variance: σ² = θ²

\mu = E(X) = \int_0^{\infty} \frac{x}{\theta} e^{-x/\theta}\,dx = \left[-x e^{-x/\theta}\right]_0^{\infty} + \int_0^{\infty} e^{-x/\theta}\,dx = \left[-\theta e^{-x/\theta}\right]_0^{\infty} = \theta,

\sigma^2 = Var(X) = \int_0^{\infty} (x-\theta)^2 \frac{1}{\theta} e^{-x/\theta}\,dx = \left[-(x-\theta)^2 e^{-x/\theta}\right]_0^{\infty} + 2\int_0^{\infty} e^{-x/\theta}(x-\theta)\,dx = \theta^2.
14
The Exponential Distribution (Example 5)

• Example 5: Suppose the waiting time to see the nurse at the student health center is distributed exponentially with a mean of 45 minutes. What is the probability that a student will wait more than an hour to get his or her generic pill?

P(x > a) = e^{-a/\theta}

P(x > 60) = e^{-60/45} = e^{-1.33} = 0.2645
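Example 5 in Python. Note that the exact answer e^(−60/45) ≈ 0.2636 differs slightly from 0.2645 because the slide rounds the exponent to 1.33 first:

```python
from math import exp

theta = 45  # mean waiting time in minutes
a = 60      # one hour

p = exp(-a / theta)  # P(X > a) = e^(-a/theta) for an exponential RV
print(round(p, 4))  # 0.2636 (0.2645 if the exponent is rounded to 1.33 first)
```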

15
Joint Probability Distributions

 In general, if X and Y are two random variables, the probability


distribution that defines their simultaneous behavior is called a joint
probability distribution.
 For example: X is the length of one dimension of an injection-molded part, and Y is the length of another dimension. We might be interested in
 P(2.95 ≤ X ≤ 3.05 and 7.60 ≤ Y ≤ 7.80).

16
Discrete Joint Probability Distributions

 The joint probability distribution of two discrete random variables X, Y is usually written as f_XY(x, y) = P(X = x, Y = y). The joint probability function satisfies

f_XY(x, y) \ge 0 and \sum_x \sum_y f_XY(x, y) = 1.

 Example 6: X can take only 1 and 3; Y can take only 1, 2 and 3; and the joint probability function of X and Y is given by the table below.

(1) Compute P(X ≥ 2, Y ≥ 2)
P(X ≥ 2, Y ≥ 2) = P(X = 3, Y = 2) + P(X = 3, Y = 3) = 0.2 + 0.3 = 0.5
(2) Compute P(X = 3)
P(X = 3) = P(X = 3, Y = 1) + P(X = 3, Y = 2) + P(X = 3, Y = 3) = 0.2 + 0.2 + 0.3 = 0.7

Joint distribution of X and Y (entries recovered from the computations above and from the marginal computations for this example):

         y = 1   y = 2   y = 3
x = 1    0.1     0.2     0
x = 3    0.2     0.2     0.3
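A Python sketch of Example 6, storing the joint p.m.f. in a dict. The entry f(1, 3) = 0 is inferred from the requirement that the probabilities sum to 1:

```python
# Joint p.m.f. of Example 6 as {(x, y): probability}; f(1, 3) = 0 inferred.
f = {(1, 1): 0.1, (1, 2): 0.2, (1, 3): 0.0,
     (3, 1): 0.2, (3, 2): 0.2, (3, 3): 0.3}

assert abs(sum(f.values()) - 1) < 1e-12  # valid joint distribution

# (1) P(X >= 2, Y >= 2)
p1 = sum(p for (x, y), p in f.items() if x >= 2 and y >= 2)
# (2) P(X = 3), the marginal of X at 3
p2 = sum(p for (x, y), p in f.items() if x == 3)

print(round(p1, 2), round(p2, 2))  # 0.5 0.7
```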


17
Continuous Joint Distributions

 A joint probability density function for the continuous random variables X and Y, denoted f_XY(x, y), satisfies the following properties:

 f_XY(x, y) \ge 0 for all x, y
 \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_XY(x, y)\,dx\,dy = 1
 P((X, Y) \in R) = \iint_R f_XY(x, y)\,dx\,dy for any region R of the plane

18
Continuous Joint Distributions (Example 7)
Calculating probabilities from a joint p.d.f.

f_XY(x, y) = \begin{cases} c x^2 y & \text{for } x^2 \le y \le 1, \\ 0 & \text{otherwise.} \end{cases}

(1) c = ?
(2) P(X \ge Y) = ?

\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_XY(x, y)\,dx\,dy = \int_{-1}^{1}\int_{x^2}^{1} c x^2 y\,dy\,dx = \frac{4}{21}c = 1, \qquad c = \frac{21}{4}.

P(X \ge Y) = \int_0^1 \int_{x^2}^{x} \frac{21}{4} x^2 y\,dy\,dx = \frac{3}{20}.
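Both answers can be verified numerically. In the sketch below the inner y-integral is done analytically, and the remaining x-integral is approximated with a midpoint Riemann sum (the grid size is an arbitrary choice):

```python
# Numerical check of Example 7: integrate over y in closed form, then
# approximate the remaining x-integral with a midpoint Riemann sum.
n = 10_000
dx = 1.0 / n

total = 0.0  # integral of x^2 (1 - x^4)/2 over [-1, 1]; should equal 4/21
prob = 0.0   # integral of x^2 (x^2 - x^4)/2 over [0, 1]; times c gives P(X >= Y)

for i in range(2 * n):           # x from -1 to 1
    x = -1 + (i + 0.5) * dx
    total += x**2 * (1 - x**4) / 2 * dx
for i in range(n):               # x from 0 to 1 (only there is x >= x^2)
    x = (i + 0.5) * dx
    prob += x**2 * (x**2 - x**4) / 2 * dx

c = 1 / total
print(round(c, 3), round(c * prob, 3))  # 5.25 0.15, i.e. 21/4 and 3/20
```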
19
Marginal Probability Distributions (Discrete)
Marginal Probability Distribution: the individual probability distribution of a random variable, computed from a joint distribution:

f_X(x) = \sum_y f_XY(x, y), \qquad f_Y(y) = \sum_x f_XY(x, y).

20
Marginal Probability Distributions (Discrete, Example)

Compute f_X(1), f_X(3), f_Y(1), f_Y(2) and f_Y(3) in Example 6.

f_X(1) = P(X=1, Y=1) + P(X=1, Y=2) = 0.1 + 0.2 = 0.3
f_X(3) = P(X=3, Y=1) + P(X=3, Y=2) + P(X=3, Y=3) = 0.2 + 0.2 + 0.3 = 0.7

f_Y(1) = P(X=1, Y=1) + P(X=3, Y=1) = 0.1 + 0.2 = 0.3
f_Y(2) = P(X=1, Y=2) + P(X=3, Y=2) = 0.2 + 0.2 = 0.4
f_Y(3) = P(X=3, Y=3) = 0.3

21
Marginal Probability Distributions(Continuous)
 Similar to joint discrete random variables, we can find the marginal probability distributions of X and Y from the joint probability distribution:

f_X(x) = \int_{-\infty}^{\infty} f_XY(x, y)\,dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f_XY(x, y)\,dx.

22
Marginal Probability Distributions(Continuous, Example)

Compute f_X(x) and f_Y(y) in Example 7.

f_X(x) = \int_{-\infty}^{\infty} f_XY(x, y)\,dy = \int_{x^2}^{1} \frac{21}{4} x^2 y\,dy = \frac{21}{8} x^2 (1 - x^4),

f_Y(y) = \int_{-\infty}^{\infty} f_XY(x, y)\,dx = \int_{-\sqrt{y}}^{\sqrt{y}} \frac{21}{4} x^2 y\,dx = \frac{7}{2} y^{5/2}.

23
Independence
• In some random experiments, knowledge of the values of X does not change any of the probabilities associated with the values of Y.

• If two random variables X and Y are independent, then

P(X \in A and Y \in B) = P(X \in A)\,P(Y \in B), for any sets A and B in the range of X and Y, respectively, and

f_XY(x, y) = f_X(x)\,f_Y(y), for all x and y.

24
Covariance and Correlation Coefficient
The covariance between two RVs X and Y is

Cov(X, Y) = E[(X - E(X))(Y - E(Y))] = E(XY) - E(X)E(Y).

Properties:

Cov(X, a) = 0, \quad Cov(X, X) = Var(X),
Cov(X, Y) = Cov(Y, X), \quad Cov(aX, bY) = ab\,Cov(X, Y),
Cov(X + a, Y + b) = Cov(X, Y),
Cov(aX + bY, Z) = a\,Cov(X, Z) + b\,Cov(Y, Z).

The correlation coefficient of X and Y is

\rho_{X,Y} = \frac{Cov(X, Y)}{\sigma_X \sigma_Y}.
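A short Python sketch of the covariance and correlation formulas, applied to the joint distribution of Example 6 (the entry f(1, 3) = 0 is inferred from the marginals, as noted earlier):

```python
from math import sqrt

# Joint p.m.f. of Example 6; f(1, 3) = 0 is inferred from the marginals.
f = {(1, 1): 0.1, (1, 2): 0.2, (1, 3): 0.0,
     (3, 1): 0.2, (3, 2): 0.2, (3, 3): 0.3}

ex  = sum(x * p for (x, y), p in f.items())      # E(X)
ey  = sum(y * p for (x, y), p in f.items())      # E(Y)
exy = sum(x * y * p for (x, y), p in f.items())  # E(XY)

cov = exy - ex * ey                              # Cov(X, Y) = E(XY) - E(X)E(Y)

var_x = sum(x**2 * p for (x, y), p in f.items()) - ex**2
var_y = sum(y**2 * p for (x, y), p in f.items()) - ey**2
rho = cov / (sqrt(var_x) * sqrt(var_y))          # correlation coefficient

print(round(cov, 2), round(rho, 3))  # 0.2 0.282
```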
25
