
Module 4: Random Variable, Probability Distribution and Expectation

Random variable:
A real-valued function defined over the sample space of a random experiment is called a random variable associated with that random experiment.
Eg: In the case of tossing two coins, the outcomes can be described as getting '0' heads, '1' head, or '2' heads. Let X be the number of heads obtained; then X assumes the values 0, 1, 2, and X is the random variable associated with the random experiment of tossing two coins.
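The two-coin example can be sketched in Python by enumerating the sample space and tallying the probability of each value of X. This is an illustrative sketch, assuming fair coins; the variable names are my own.

```python
from itertools import product

# Sample space of tossing two coins: ('H','H'), ('H','T'), ('T','H'), ('T','T')
sample_space = list(product(["H", "T"], repeat=2))

# The random variable X maps each outcome to the number of heads.
X = {outcome: outcome.count("H") for outcome in sample_space}

# Probability that X takes each value (fair coins: each outcome has prob 1/4).
pmf = {}
for outcome, value in X.items():
    pmf[value] = pmf.get(value, 0) + 1 / len(sample_space)

print({v: pmf[v] for v in sorted(pmf)})  # {0: 0.25, 1: 0.5, 2: 0.25}
```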

Discrete random variable:


A random variable is said to be discrete if it assumes only a countable set of specified values.
Eg: When X takes only the values 1, 2, 3, …

Continuous random variable:


A random variable is said to be continuous if it can assume any value in a given interval.
Eg: when X takes any value in a given interval (a,b).

Distribution (Probability distribution / Probability function / Density function) of a discrete random variable:
Let X be a random variable assuming the values x1, x2, x3, …
Let 'x' stand for any one of the values x1, x2, x3, …
Then the probability that the random variable X takes the value x is called the probability function of X and is denoted by f(x) or P(x).

Distribution function:
Let X be a random variable and 'x' be any value of it. Then P(X ≤ x), denoted by F(x), is called the distribution function of X.
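For a discrete random variable, F(x) is obtained by accumulating the probability function over all values ≤ x. A minimal sketch, assuming the two-coin pmf from the earlier example:

```python
pmf = {0: 0.25, 1: 0.5, 2: 0.25}  # assumed pmf: number of heads in two tosses

def F(x, pmf):
    """Distribution function: F(x) = P(X <= x)."""
    return sum(p for value, p in pmf.items() if value <= x)

print(F(1, pmf))  # 0.75, i.e. P(X <= 1)
print(F(2, pmf))  # 1.0
```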

Moments of a random variable:


Let X be a random variable with mean x̄. Then E(X − x̄)^r is called the rth moment of X and is denoted by µr.
This is also called the central moment of order r.
∴ µr = E(X − x̄)^r

E(X − a)^r, where 'a' is any arbitrary number, is called the rth raw moment about 'a' and is denoted by µr'.
∴ µr' = E(X − a)^r

Note:
1. When a = 0, µr' = E(X^r), the rth raw moment about the origin.
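These definitions translate directly into code: a raw moment is an expectation taken about an arbitrary point a (a = 0 gives the moment about the origin), and a central moment takes a equal to the mean. A sketch, again assuming the two-coin pmf:

```python
def raw_moment(pmf, r, a=0.0):
    """rth raw moment about 'a': E[(X - a)^r]."""
    return sum(p * (x - a) ** r for x, p in pmf.items())

def central_moment(pmf, r):
    """rth central moment: E[(X - mean)^r]."""
    mean = raw_moment(pmf, 1)  # the mean is the first raw moment about the origin
    return raw_moment(pmf, r, a=mean)

pmf = {0: 0.25, 1: 0.5, 2: 0.25}
print(raw_moment(pmf, 1))      # 1.0 (the mean)
print(central_moment(pmf, 2))  # 0.5 (the variance)
```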

Recurrence relation between Central moments and raw moments:


Let µr be the rth central moment and µr' the rth raw moment about the origin.
Then µr = E(X − x̄)^r = E(X − µ1')^r, since x̄ = µ1'.
Expanding by the binomial theorem:
µr = µr' − rC1 µ'r−1 µ1' + rC2 µ'r−2 (µ1')^2 − … + (−1)^r (µ1')^r
This is the recurrence relation. Putting r = 1, 2, 3, 4, … and simplifying, we get each of the central moments in terms of the raw moments.
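For r = 2 the relation reduces to the familiar µ2 = µ2' − (µ1')², which can be checked numerically on the assumed two-coin pmf:

```python
pmf = {0: 0.25, 1: 0.5, 2: 0.25}  # assumed pmf from the earlier example

mu1p = sum(p * x for x, p in pmf.items())       # first raw moment (the mean)
mu2p = sum(p * x ** 2 for x, p in pmf.items())  # second raw moment
mu2 = sum(p * (x - mu1p) ** 2 for x, p in pmf.items())  # second central moment

# Recurrence relation at r = 2: mu2 = mu2' - (mu1')^2
assert abs(mu2 - (mu2p - mu1p ** 2)) < 1e-12
print(mu2)  # 0.5
```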

Uses of moments:
Moments can be used to describe important characteristics of data, namely
• Mean
• Standard deviation
• Coefficient of skewness
• Measure of kurtosis

Mean = µ1' (ie; the first moment about the origin is the mean)

SD = √µ2 (ie; the square root of the second central moment is the SD)

Coefficient of skewness = µ3 / √(µ2³)

Measure of kurtosis = µ4 / µ2²
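The four measures above can be computed from the moments of any pmf. A sketch on the assumed two-coin pmf, which is symmetric, so its skewness should come out to zero:

```python
import math

def central_moment(pmf, r):
    """rth central moment: E[(X - mean)^r]."""
    mean = sum(p * x for x, p in pmf.items())
    return sum(p * (x - mean) ** r for x, p in pmf.items())

pmf = {0: 0.25, 1: 0.5, 2: 0.25}  # assumed: symmetric two-coin pmf

mean = sum(p * x for x, p in pmf.items())       # mu1' about the origin
mu2, mu3, mu4 = (central_moment(pmf, r) for r in (2, 3, 4))

sd = math.sqrt(mu2)                # square root of the second central moment
skewness = mu3 / math.sqrt(mu2 ** 3)
kurtosis = mu4 / mu2 ** 2

print(mean, sd, skewness, kurtosis)  # 1.0 0.707... 0.0 2.0
```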

Joint probability:
Let X and Y be two discrete random variables. Let X assume the values x1, x2, …, xm and Y assume the values y1, y2, …, yn, and let (xi, yj) be any pair of values of X and Y.
Then the probability that X = xi and Y = yj is called the joint probability and is denoted by P(xi, yj).

Joint probability function:


The joint probability function of random variables X and Y is defined as the probability of X taking any value x and Y taking any value y within the range of values of X and Y. It is denoted by P(x, y) or f(x, y).
f(x, y) = P(X = x, Y = y)

Properties of joint probability function:


1. P(x, y) ≥ 0, ie probability is non-negative.
2. ∑x ∑y P(x, y) = 1, ie the sum of the probabilities is 1.
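Both properties are easy to check on a concrete table. The joint values below are hypothetical, chosen only so that the entries sum to 1:

```python
# Hypothetical joint probability table for X in {0, 1} and Y in {0, 1, 2}.
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

assert all(p >= 0 for p in joint.values())     # property 1: non-negative
assert abs(sum(joint.values()) - 1.0) < 1e-9   # property 2: sums to 1
print("valid joint probability function")
```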

Joint distribution function of X and Y:


Let P(x, y) be the joint probability function of X and Y, ie P(x, y) = P(X = x, Y = y).
Then P(X ≤ x, Y ≤ y), denoted by F(x, y), is called the joint distribution function of X and Y.

Marginal probability function:


From the joint probability function of X and Y, the probability function of X can be derived; it is called the marginal probability function of X and is denoted by f1(x).
ie f1(x) = P(X = x, Y takes any value) = ∑j f(x, yj), summed over j = 1, 2, …, n
Similarly we have the marginal probability function of Y, denoted by f2(y), defined as
f2(y) = P(Y = y, X takes any value) = ∑i f(xi, y), summed over i = 1, 2, …, m
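Marginalising is just summing the joint table over the other variable. A sketch on the same hypothetical joint table used earlier:

```python
# Hypothetical joint probability table for X in {0, 1} and Y in {0, 1, 2}.
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

f1 = {}  # marginal probability function of X: sum over all values of Y
f2 = {}  # marginal probability function of Y: sum over all values of X
for (x, y), p in joint.items():
    f1[x] = f1.get(x, 0.0) + p
    f2[y] = f2.get(y, 0.0) + p

print({x: round(p, 2) for x, p in f1.items()})  # {0: 0.4, 1: 0.6}
print({y: round(p, 2) for y, p in f2.items()})  # {0: 0.25, 1: 0.45, 2: 0.3}
```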

Conditional Probability Functions


Let f(x, y) be the joint probability function of X and Y. Let f1(x) and f2(y) be respectively the marginal probability functions of X and Y. Then the conditional probability function of X given Y is defined as
f(x/y) = f(x, y) / f2(y)
and the conditional probability function of Y given X is defined as
f(y/x) = f(x, y) / f1(x)
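The definition f(x/y) = f(x, y) / f2(y) can be sketched on the same hypothetical joint table: fix y, take that column of the table, and divide by the marginal f2(y).

```python
# Hypothetical joint probability table for X in {0, 1} and Y in {0, 1, 2}.
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

def conditional_x_given_y(joint, y):
    """Conditional pmf f(x|y) = f(x, y) / f2(y)."""
    f2y = sum(p for (xi, yj), p in joint.items() if yj == y)  # marginal f2(y)
    return {xi: p / f2y for (xi, yj), p in joint.items() if yj == y}

print(conditional_x_given_y(joint, 0))  # {0: 0.4, 1: 0.6}
```

Note that each conditional pmf sums to 1, as any probability function must.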
Independent random variables:
Two random variables X and Y are said to be independent if the product of their marginal probability functions equals their joint probability function.
Ie; if f1(x) × f2(y) = f(x, y) for all x and y, then X and Y are independent.
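Independence can be checked by comparing f1(x) × f2(y) with f(x, y) for every pair. A sketch with two assumed tables, one built as a product of marginals (independent) and one where X and Y are always equal (dependent):

```python
# Assumed independent table: built directly as a product of two marginals.
joint = {(x, y): px * py
         for x, px in {0: 0.4, 1: 0.6}.items()
         for y, py in {0: 0.5, 1: 0.5}.items()}

def is_independent(joint, tol=1e-12):
    """True if f1(x) * f2(y) matches f(x, y) for every pair in the table."""
    f1, f2 = {}, {}
    for (x, y), p in joint.items():
        f1[x] = f1.get(x, 0.0) + p
        f2[y] = f2.get(y, 0.0) + p
    return all(abs(f1[x] * f2[y] - p) < tol for (x, y), p in joint.items())

print(is_independent(joint))  # True

dependent = {(0, 0): 0.5, (1, 1): 0.5}  # X and Y always equal: dependent
print(is_independent(dependent))  # False
```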
