Abahy Math Assignment Reg. No. 2023104317

The document covers various probability distributions including Geometric, Uniform, Joint Probability, Poisson, Exponential, Normal, Gamma, Weibull, and their respective properties such as expectation and variance. It provides problems and solutions related to these distributions, illustrating their applications in real-world scenarios. Additionally, it discusses concepts like moment generating functions and the Central Limit Theorem.

Random Variables and Probability Distributions

Discrete and Continuous Distributions
Problem 1: Geometric Distribution->

The Geometric Distribution is a discrete probability distribution


that models the number of trials needed to get the first success in a
sequence of independent Bernoulli trials (trials with two outcomes:
success or failure).
A fair die is rolled until a six appears. Find the probability that a six appears
on the third roll.

Solution: The Geometric distribution is given by:

P(X = k) = (1 − p)^{k−1} p

For a fair die, p = 1/6:

P(X = 3) = (5/6)^2 × (1/6) = 25/216 ≈ 0.1157
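A quick numerical check (not part of the original assignment), assuming Python with scipy is available:

# Sketch: verify P(X = 3) for a Geometric(p = 1/6) variable (number of trials to first success).
from scipy.stats import geom

p = 1 / 6
# scipy's geom counts the trial on which the first success occurs, matching (1-p)^(k-1) * p.
print(geom.pmf(3, p))  # ~0.11574, i.e. 25/216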

Uniform Distribution
Discrete Uniform Distribution: A random variable X follows a discrete uniform distribution if it takes finitely many equally spaced values with equal probability. The continuous analogue, used in the problem below, places constant probability density over an interval [a, b].

Problem: A random variable X is uniformly distributed over [2, 10]. Find P(4 ≤ X ≤ 8).
Solution: The PDF of a uniform distribution is:

f_X(x) = 1/(b − a), a ≤ x ≤ b

For X ∼ U(2, 10):

f_X(x) = 1/(10 − 2) = 1/8

P(4 ≤ X ≤ 8) = (8 − 4)/8 = 0.5
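The same value can be checked with scipy's uniform distribution; a small sketch, assuming scipy is available:

# Sketch: verify P(4 <= X <= 8) for X ~ U(2, 10) via the CDF.
from scipy.stats import uniform

X = uniform(loc=2, scale=8)  # scipy parametrizes U(loc, loc + scale)
print(X.cdf(8) - X.cdf(4))   # 0.5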

Joint Probability->
Joint Probability refers to the probability of two or more events occurring
simultaneously. If A and B are two events, their joint probability is denoted
as:
P(A∩B) or P(A,B)

Question-Two dice are rolled. Find the probability that the sum is at least 10.

Solution: The outcomes with sum ≥ 10 are:

(4, 6), (5, 5), (5, 6), (6, 4), (6, 5), (6, 6)

Total favourable cases: 6; total outcomes: 6 × 6 = 36

P(sum ≥ 10) = 6/36 = 1/6
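Since the sample space is tiny, a brute-force enumeration in Python (not part of the original solution) confirms the count:

# Sketch: count dice pairs whose sum is at least 10.
favourable = [(a, b) for a in range(1, 7) for b in range(1, 7) if a + b >= 10]
print(len(favourable), len(favourable) / 36)  # 6, 0.1666...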

Poisson Process->
A Poisson process is characterized by a constant rate λ (called the intensity rate), which represents the average number of events per unit time.
Question->
A factory receives 6 defective items per day on average. What is the
probability of receiving exactly 4 defective items in a day?

Solution: Poisson formula:

P(X = k) = e^{−λ} λ^k / k!

For λ = 6, k = 4:

P(X = 4) = e^{−6} · 6^4 / 4! = 1296 e^{−6} / 24

Using e^{−6} ≈ 0.00248:

P(X = 4) ≈ 0.1339
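A one-line check with scipy.stats, assuming it is available:

# Sketch: verify P(X = 4) for X ~ Poisson(lambda = 6).
from scipy.stats import poisson

print(poisson.pmf(4, mu=6))  # ~0.1339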

Exponential Distribution->
The Exponential Distribution is a continuous probability distribution used
to model the time between independent events that occur at a constant
average rate. It is commonly associated with waiting times.

Question->
A machine breaks down every 5 hours on average. What is the probability it
survives more than 8 hours without failure?

Solution: Exponential survival function:

P(T > t) = e^{−λt}

For λ = 1/5:

P(T > 8) = e^{−8/5} = e^{−1.6} ≈ 0.2019
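A quick check using scipy's exponential distribution, which is parametrized by the scale 1/λ (assuming scipy is available):

# Sketch: verify P(T > 8) when the mean time between failures is 5 hours (rate 1/5).
from scipy.stats import expon

T = expon(scale=5)   # scipy uses scale = 1/lambda
print(T.sf(8))       # survival function: exp(-8/5) ~ 0.2019
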
Normal Approximation to Binomial->
The Normal Approximation to the Binomial is a method used to
approximate a Binomial Distribution using a Normal Distribution when
the number of trials n is large. This simplifies calculations, especially when
dealing with large values of n.

Question->
A multiple-choice exam has 50 questions with 4 choices each. A student
guesses randomly. What is the probability of scoring more than 20 correct
answers?

Solution: Approximate using Normal Distribution:

μ = np = 50 × 0.25 = 12.5, σ = √(np(1 − p)) = √(50 × 0.25 × 0.75) = √9.375 ≈ 3.06

Convert to the standard normal variable:

Z = (X − μ)/σ = (20 − 12.5)/3.06 ≈ 2.45

Using the normal table:

P(Z > 2.45) ≈ 1 − 0.9929 = 0.0071
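As a sanity check, the sketch below (assuming scipy is available) recomputes the approximation and also prints the exact binomial tail; the two differ slightly because no continuity correction was applied above:

# Sketch: normal approximation for P(X > 20) with X ~ Bin(50, 0.25), vs the exact tail.
import math
from scipy.stats import norm, binom

n, p = 50, 0.25
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
z = (20 - mu) / sigma
print(norm.sf(z))          # ~0.0071 (approximation used above)
print(binom.sf(20, n, p))  # exact P(X > 20) for comparison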

Gamma Distribution->
The Gamma Distribution is a continuous probability distribution that
models the time until k independent events occur in a Poisson process. It
generalizes the Exponential Distribution, which models the time until the
first event occurs.

Question->
A system’s failure time follows a Gamma distribution with shape parameter
k =3 and rate λ=2. Find the expected failure time.

Solution: The expectation of a Gamma distributed variable is:

E[X] = k/λ = 3/2 = 1.5
Weibull Distribution->
The Weibull Distribution is a continuous probability distribution used to
model lifetimes of objects or systems, reliability analysis, and failure
rates. It is flexible and can represent different types of failure behaviors
based on its shape parameter.

Question->
A component’s lifetime follows a Weibull distribution with shape parameter
β=2 and scale parameter η=5. Find the probability it lasts more than 6 units
of time.

Solution: Weibull survival function:


P(T > t) = e^{−(t/η)^β}

P(T > 6) = e^{−(6/5)^2} = e^{−1.44} ≈ 0.2369
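A short check with scipy's weibull_min, assuming scipy is available:

# Sketch: verify P(T > 6) for a Weibull with shape beta = 2 and scale eta = 5.
from scipy.stats import weibull_min

T = weibull_min(2, scale=5)  # shape parameter first, scale = eta
print(T.sf(6))               # exp(-(6/5)^2) ~ 0.2369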

Problem 9: Expectation of a Binomial Distribution


Find the expectation of a binomially distributed random variable X ∼ Bin ( n , p ).

Solution: The expectation of a binomial random variable is given by:

E [ X ] =np

Problem 10: Variance of a Binomial Distribution

Find the variance of a binomially distributed random variable X ∼ Bin ( n , p ).

Solution: The variance of a binomial random variable is given by:

Var ( X )=np ( 1− p )

Problem 11: Conditional Expectation

Let X be a uniform random variable on [0, 2]. Find E[X | X > 1].

Solution: The conditional expectation is:

E[X | X > a] = (∫_a^b x f_X(x) dx) / P(X > a)

Since f_X(x) = 1/2 for 0 ≤ x ≤ 2 and P(X > 1) = 1/2:

E[X | X > 1] = (∫_1^2 x · (1/2) dx) / (1/2)

= ([x²/4]_1^2) / (1/2)

= (1 − 1/4) / (1/2) = 1.5

Problem 12: Moment Generating Function of Exponential Distribution

Find the moment generating function (MGF) of an exponential random


variable with parameter λ .

Solution: The MGF is defined as:


M_X(t) = E[e^{tX}] = ∫_0^∞ e^{tx} λ e^{−λx} dx

= λ ∫_0^∞ e^{−(λ−t)x} dx

= λ × 1/(λ − t), for t < λ

M_X(t) = λ/(λ − t), t < λ
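A numerical sketch comparing the closed form with direct integration, for hypothetical values λ = 2 and t = 0.5 (assuming scipy is available):

# Sketch: compare the closed-form MGF lambda/(lambda - t) with numerical integration.
import math
from scipy.integrate import quad

lam, t = 2.0, 0.5  # hypothetical values with t < lambda
mgf_numeric, _ = quad(lambda x: math.exp(t * x) * lam * math.exp(-lam * x), 0, math.inf)
print(mgf_numeric, lam / (lam - t))  # both ~1.3333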

Problem 13: Mean and Variance of Poisson Distribution


Show that the mean and variance of a Poisson-distributed random variable
with parameter λ are both equal to λ .

Solution: The Poisson distribution is given by:

P(X = k) = e^{−λ} λ^k / k!, k = 0, 1, 2, …

Using the expectation formula:

E[X] = Σ_{k=0}^∞ k · e^{−λ} λ^k / k!

Differentiating the generating function of the Poisson distribution and solving, we get:

E[X] = λ, Var(X) = λ

Problem 14: Exponential Distribution Median

Find the median of an exponential distribution with parameter λ .

Solution: The median m satisfies:

P(X ≤ m) = 0.5

Using the CDF:

1 − e^{−λm} = 0.5

Solving for m:

m = ln 2 / λ
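A quick check for a hypothetical rate λ = 0.5, assuming scipy is available:

# Sketch: check that the exponential median equals ln(2)/lambda.
import math
from scipy.stats import expon

lam = 0.5  # hypothetical rate
print(expon(scale=1 / lam).ppf(0.5), math.log(2) / lam)  # both ~1.3863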

Problem 15: Transformation of Random Variables

If X is an exponential random variable with mean 1/λ, find the distribution of Y = X².
Solution: The transformation method gives:

F_Y(y) = P(Y ≤ y) = P(X² ≤ y) = P(X ≤ √y)

Using the exponential CDF:

F_Y(y) = 1 − e^{−λ√y}

Differentiating, the PDF is:

f_Y(y) = (λ / (2√y)) e^{−λ√y}, y > 0

Problem 16: Central Limit Theorem

A fair coin is tossed 1000 times. Approximate the probability of getting


between 480 and 520 heads.

Solution: Using normal approximation:

μ = np = 1000 × 0.5 = 500, σ = √(np(1 − p)) = √250 ≈ 15.81

Converting to the standard normal variable:

Z₁ = (480 − 500)/15.81 ≈ −1.26, Z₂ = (520 − 500)/15.81 ≈ 1.26

Using normal tables:

P(−1.26 < Z < 1.26) ≈ 0.793
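A sketch (assuming scipy is available) comparing this approximation with the exact binomial probability; the exact value is a bit larger since no continuity correction was used:

# Sketch: normal approximation vs exact binomial for P(480 <= X <= 520), X ~ Bin(1000, 0.5).
from scipy.stats import binom, norm

n, p = 1000, 0.5
print(norm.cdf(1.26) - norm.cdf(-1.26))             # ~0.792 (approximation above)
print(binom.cdf(520, n, p) - binom.cdf(479, n, p))  # exact P(480 <= X <= 520)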

Problem 17: Expected Value of a Uniform Distribution

Let X be uniformly distributed over [a, b]. Find E[X²].

Solution: Using expectation formula:

E[X²] = ∫_a^b x² · 1/(b − a) dx

Evaluating the integral:

E[X²] = (1/(b − a)) [x³/3]_a^b = (b³ − a³) / (3(b − a))

Problem 18: Variance of a Uniform Distribution

Find Var ( X ) for a uniform distribution U ( a , b ).

Solution:

Var(X) = E[X²] − (E[X])²

Using previous results:

Var(X) = (b³ − a³) / (3(b − a)) − ((a + b)/2)²
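This expression simplifies to the familiar (b − a)²/12; a numerical spot-check for hypothetical endpoints, in plain Python:

# Sketch: the expression above simplifies to (b - a)^2 / 12; check for sample a, b.
a, b = 2.0, 10.0  # hypothetical endpoints
var = (b**3 - a**3) / (3 * (b - a)) - ((a + b) / 2) ** 2
print(var, (b - a) ** 2 / 12)  # both ~5.3333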

Problem 19: CDF of a Normal Distribution

Find the cumulative distribution function (CDF) of a standard normal variable.

Solution: The standard normal CDF is given by:

F_Z(z) = (1/√(2π)) ∫_{−∞}^z e^{−t²/2} dt

This integral does not have a closed-form solution and is computed using
numerical methods.

Problem 20: Moment Generating Function of Normal Distribution

Find the moment generating function (MGF) of a normal random variable X ∼ N(μ, σ²).

Solution: The MGF is defined as:


M_X(t) = E[e^{tX}] = (1/(σ√(2π))) ∫_{−∞}^∞ e^{tx} e^{−(x−μ)²/(2σ²)} dx

Completing the square and solving the Gaussian integral,

M_X(t) = e^{μt + σ²t²/2}

Problem 21: Expected Value of a Geometric Distribution

Find the expected value of a geometric random variable with parameter p.

Solution: The geometric distribution is given by:

P(X = k) = (1 − p)^{k−1} p, k = 1, 2, 3, …

Using the expectation formula:

E[X] = Σ_{k=1}^∞ k (1 − p)^{k−1} p

Using summation formulas,

E[X] = 1/p

Problem 22: Variance of a Geometric Distribution

Find the variance of a geometric random variable with parameter p.

Solution: The variance is given by:

Var(X) = E[X²] − (E[X])²

Using the moment formula,

Var(X) = (1 − p) / p²

Problem 23: Mean of a Gamma Distribution

Find the mean of a gamma-distributed random variable X ∼ Gamma ( α , β ).

Solution: The gamma PDF is given by:

f_X(x) = (β^α / Γ(α)) x^{α−1} e^{−βx}, x > 0

Using the expectation formula,

E[X] = α/β

Problem 24: Variance of a Gamma Distribution

Find the variance of a gamma-distributed random variable X ∼ Gamma ( α , β ).

Solution: Using moment properties,

Var(X) = α/β²

Problem 25: Mean of a Weibull Distribution

Find the mean of a Weibull-distributed random variable X ∼ Weibull ( k , λ ).

Solution: The Weibull distribution has PDF:

f_X(x) = (k/λ) (x/λ)^{k−1} e^{−(x/λ)^k}, x > 0

Using the gamma function,

E[X] = λ Γ(1 + 1/k)

Problem 26: Variance of a Weibull Distribution

Find the variance of a Weibull-distributed random variable X ∼ Weibull ( k , λ ).

Solution:

Var(X) = λ² [Γ(1 + 2/k) − (Γ(1 + 1/k))²]

Problem 27: Sum of Independent Normal Variables

If X₁ ∼ N(μ₁, σ₁²) and X₂ ∼ N(μ₂, σ₂²) are independent, find the distribution of Y = X₁ + X₂.

Solution: The sum of independent normal variables is also normal:

Y ∼ N(μ₁ + μ₂, σ₁² + σ₂²)

Problem 28: Moment Generating Function of Exponential Distribution

Find the moment generating function (MGF) of an exponential random


variable X ∼ exp ( λ ).

Solution: The MGF is given by:

M_X(t) = E[e^{tX}] = ∫_0^∞ e^{tx} λ e^{−λx} dx

Solving the integral,

M_X(t) = λ/(λ − t), t < λ

Problem 29: Probability Density Function of Uniform Distribution

Find the probability density function (PDF) of a uniform random variable


X ∼ U ( a , b ).

Solution: The PDF is given by:

f_X(x) = 1/(b − a), a ≤ x ≤ b
b−a

Problem 30: Expected Value of Uniform Distribution

Find the expected value of a uniform random variable X ∼ U ( a , b ).

Solution: Using the expectation formula,

E[X] = ∫_a^b x · 1/(b − a) dx

Solving the integral,

E[X] = (a + b)/2
Joint Distribution
The joint distribution of two or more random variables describes the
probability that these variables take on specific values simultaneously. It
provides a complete characterization of the dependence between the
variables.

Problem 1: Let X and Y be continuous random variables with joint PDF:

f_{X,Y}(x, y) = 2 for 0 ≤ y ≤ x ≤ 1, and 0 otherwise.

Find P(X ≤ 1/2, Y ≤ 1/4).

Solution to Problem 1:

Because the density is supported on 0 ≤ y ≤ x ≤ 1, the constraint y ≤ x must be respected. Split the region at x = 1/4: for 0 ≤ x ≤ 1/4, y ranges over [0, x]; for 1/4 ≤ x ≤ 1/2, y ranges over [0, 1/4].

P(X ≤ 1/2, Y ≤ 1/4) = ∫_0^{1/4} ∫_0^x 2 dy dx + ∫_{1/4}^{1/2} ∫_0^{1/4} 2 dy dx

The first integral gives ∫_0^{1/4} 2x dx = [x²]_0^{1/4} = 1/16.

The second gives ∫_{1/4}^{1/2} (1/2) dx = (1/2) × (1/4) = 1/8.

Thus, the required probability is 1/16 + 1/8 = 3/16.
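A Monte Carlo spot-check of this result, assuming numpy is available; sampling uses the marginal f_X(x) = 2x and the conditional Y | X = x ∼ U(0, x) implied by the triangle density:

# Sketch: check P(X <= 1/2, Y <= 1/4) = 3/16 by simulation.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = np.sqrt(rng.uniform(0, 1, n))  # inverse-CDF sampling of f_X(x) = 2x on (0, 1)
y = x * rng.uniform(0, 1, n)       # given X = x, Y is uniform on (0, x)
print(np.mean((x <= 0.5) & (y <= 0.25)))  # ~0.1875 = 3/16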

Problem 2: Let X and Y be discrete random variables with joint PMF:


P(X = x, Y = y) = c / ((x + 1)(y + 1))

for x, y ∈ {1, 2, 3}. Find the constant c.

Solution to Problem 2:

Since the total probability must be 1, we sum over all possible values:

Σ_{x=1}^3 Σ_{y=1}^3 c / ((x + 1)(y + 1)) = 1.

The double sum factorizes:

Σ_{x=1}^3 Σ_{y=1}^3 c / ((x + 1)(y + 1)) = c (Σ_{x=1}^3 1/(x + 1)) (Σ_{y=1}^3 1/(y + 1)).

Calculating the individual summations:

Σ_{x=1}^3 1/(x + 1) = 1/2 + 1/3 + 1/4 = 6/12 + 4/12 + 3/12 = 13/12.

Σ_{y=1}^3 1/(y + 1) = 1/2 + 1/3 + 1/4 = 13/12.

Thus, the equation becomes:

c × (13/12) × (13/12) = 1.

Solving for c:

c = 144/169.
Marginal and Conditional Distributions
The marginal distribution of a subset of random variables is obtained by
summing (in the discrete case) or integrating (in the continuous case) over
the remaining variables in a joint distribution. It represents the probability
distribution of one variable irrespective of the values of the others.

Problem 3: Given the joint PDF:

f_{X,Y}(x, y) = 6(1 − x), 0 < y < x < 1,

find the marginal PDFs of X and Y.

Solution to Problem 3:

The marginal PDF of X is obtained by integrating out y:

f_X(x) = ∫_0^x 6(1 − x) dy = 6(1 − x) x = 6x(1 − x), 0 ≤ x ≤ 1.

The marginal PDF of Y is obtained by integrating out x:

f_Y(y) = ∫_y^1 6(1 − x) dx = 6 [x − x²/2]_y^1

= 6 [(1 − 1/2) − (y − y²/2)] = 6 [1/2 − y + y²/2].

Thus,

f_Y(y) = 3 − 6y + 3y² = 3(1 − y)², 0 ≤ y ≤ 1.

Problem 4: For the joint PDF of Problem 3,

f_{X,Y}(x, y) = 6(1 − x), 0 < y < x < 1,

find the conditional PDF f_{Y|X}(y | x).

Solution to Problem 4:

The conditional PDF is given by:

f_{Y|X}(y | x) = f_{X,Y}(x, y) / f_X(x).

Using the previously found f_X(x) = 6x(1 − x):

f_{Y|X}(y | x) = 6(1 − x) / (6x(1 − x)) = 1/x, 0 ≤ y ≤ x.

That is, given X = x, Y is uniformly distributed on [0, x].

Covariance and Correlation


Covariance measures the degree to which two random variables
change together. It indicates whether an increase in one variable
corresponds to an increase or decrease in another variable.

Problem 5: Given X and Y with E[X] = 2, E[Y] = 3, E[XY] = 9, Var(X) = 4, and Var(Y) = 9, compute the covariance and correlation coefficient of X and Y.

Solution to Problem 5:

The covariance is:

Cov(X, Y) = E[XY] − E[X]E[Y] = 9 − 2 × 3 = 3.

The correlation coefficient is given by:

ρ(X, Y) = Cov(X, Y) / √(Var(X) Var(Y)) = 3 / √(4 × 9) = 3/6 = 0.5

Thus, the correlation coefficient is 0.5.

Problem 6: If X and Y are independent random variables with variances


Var ( X )=4 and Var ( Y )=9 , compute Cov ( X , Y ) .

Solution to Problem 6:

The covariance is given by:

Cov(X, Y) = E[XY] − E[X]E[Y].

Since X and Y are independent, E[XY] = E[X]E[Y], so:

Cov(X, Y) = 0.

Independence forces zero covariance regardless of the individual variances, so the given values Var(X) = 4 and Var(Y) = 9 do not affect the answer.

Transformation of Random Variables


The transformation of random variables is a fundamental concept in
probability theory where new random variables are created by applying a
function to one or more existing random variables. This technique is useful
for deriving the distributions of new random variables resulting from known
ones.

Problem 7: If X follows a uniform distribution on ( 0 , 1 ), find the distribution of


Y =−lnX

Solution to Problem 7:

We use the transformation method. The cumulative distribution function


(CDF) of X is:

F X ( x )=P ( X ≤ x )=x , 0< x <1.

Since Y =−ln ( X ) , we find the CDF of Y :

F Y ( y )=P ( Y ≤ y )=P (−ln ( X ) ≤ y ) .

Rearranging:

P (−ln ( X ) ≤ y )=P ( X ≥ e− y ) .

¿ 1−P ( X < e− y ) .

¿ 1−F X ( e− y ) .

−y
¿ 1−e , y >0.

Differentiating F Y ( y ) gives the PDF:

f_Y(y) = d/dy (1 − e^{−y}) = e^{−y}, y > 0.

Thus, Y follows an exponential distribution with parameter λ = 1.
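An empirical check of this transformation, assuming numpy is available:

# Sketch: check that Y = -ln(X) with X ~ U(0, 1) behaves like Exponential(1).
import numpy as np

rng = np.random.default_rng(0)
y = -np.log(rng.uniform(0, 1, 1_000_000))
print(y.mean(), np.mean(y > 1.0))  # ~1.0 (mean of Exp(1)) and ~exp(-1) ~ 0.3679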

Problem 8: Let X be an exponential random variable with mean λ^{−1}. Find the distribution of Y = X².

Solution to Problem 8:

The PDF of X is:

f_X(x) = λ e^{−λx}, x > 0.

The transformation is Y = X², so we find its CDF:

F_Y(y) = P(Y ≤ y) = P(X² ≤ y) = P(X ≤ √y) = F_X(√y)

Solving:

F_Y(y) = 1 − e^{−λ√y}

Differentiating:

f_Y(y) = d/dy (1 − e^{−λ√y}) = λ e^{−λ√y} × 1/(2√y)

Thus, the PDF of Y is:

f_Y(y) = (λ / (2√y)) e^{−λ√y}, y > 0.

Convergence and Central Limit Theorem.


The Central Limit Theorem (CLT) is one of the most fundamental results
in probability theory. It states that the sum (or average) of a large number of
independent and identically distributed (i.i.d.) random variables tends toward
a normal (Gaussian) distribution, regardless of the original distribution.
Problem 9: Let X₁, X₂, …, Xₙ be i.i.d. random variables with mean μ = 3 and variance σ² = 4. Using the Central Limit Theorem, approximate:

P( (Σ_{i=1}^{100} X_i − 300) / 20 ≤ 1.96 ).

Solution to Problem 9:

The Central Limit Theorem states that, for large n,

(Σ_{i=1}^n X_i − nμ) / (σ√n) ≈ N(0, 1).

Step 1: Standardization

Here nμ = 100 × 3 = 300 and σ√n = 2 × √100 = 20, so

Z = (Σ_{i=1}^{100} X_i − 300) / 20 ∼ N(0, 1) approximately.

We need to compute P(Z ≤ 1.96).

Step 2: Normal Probability

From standard normal tables:

P(Z ≤ 1.96) = 0.975.

Thus, the approximate probability is 0.975.

Problem 10: Let X_n = n/(n + 1). Show that X_n converges in probability to 1.

Solution to Problem 10:

A sequence of random variables X_n converges in probability to X if:

∀ ε > 0, lim_{n→∞} P(|X_n − X| ≥ ε) = 0.

Step 1: Compute |X_n − X|

|X_n − 1| = |n/(n + 1) − 1| = 1/(n + 1).

Step 2: Probability Condition

For any ε > 0,

P(|X_n − 1| ≥ ε) = P(1/(n + 1) ≥ ε).

Since 1/(n + 1) decreases to 0, we have 1/(n + 1) < ε for all sufficiently large n, hence:

P(|X_n − 1| ≥ ε) = 0 for sufficiently large n.

Thus, X_n converges in probability to X = 1.
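A Monte Carlo illustration of Problem 9's claim, assuming numpy is available; the Gamma source distribution below is a hypothetical choice with mean 3 and variance 4 (shape 9/4, scale 4/3), underscoring that the CLT does not depend on the underlying distribution:

# Sketch: simulate sums of 100 i.i.d. draws with mean 3 and variance 4, then standardize.
import numpy as np

rng = np.random.default_rng(0)
sums = rng.gamma(shape=2.25, scale=4 / 3, size=(100_000, 100)).sum(axis=1)
z = (sums - 300) / 20
print(np.mean(z <= 1.96))  # close to the predicted 0.975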

Random Vector
A random vector is a collection of multiple random variables grouped
together, often used to describe multivariate probability distributions. It
extends the concept of a single random variable to higher dimensions.

Problem 1: Joint Distribution of a Random Vector

Let X and Y be independent uniform random variables on (0,1). Find the joint
probability density function (PDF) of (X,Y).

Solution to Problem 1:

Since X and Y are independent, the joint PDF is given by:

f_{X,Y}(x, y) = f_X(x) f_Y(y).

Since both X and Y are uniform on (0, 1), we have:

f_X(x) = 1, 0 < x < 1.

f_Y(y) = 1, 0 < y < 1.

Thus, the joint PDF is:

f_{X,Y}(x, y) = 1 × 1 = 1, 0 < x < 1, 0 < y < 1.

Otherwise, f_{X,Y}(x, y) = 0.

Problem 2: Expectation of a Random Vector

Let (X, Y) have the joint PDF:

f_{X,Y}(x, y) = 2 for 0 < x < y < 1, and 0 otherwise.

Find E[X].

Solution to Problem 2:

Expectation is given by:

E[X] = ∫_{−∞}^{∞} x f_X(x) dx.

Step 1: Find the Marginal PDF of X

f_X(x) = ∫_x^1 2 dy = 2(1 − x), 0 < x < 1.

Step 2: Compute Expectation

E[X] = ∫_0^1 x f_X(x) dx

= ∫_0^1 x · 2(1 − x) dx

= 2 ∫_0^1 (x − x²) dx

= 2 [x²/2 − x³/3]_0^1

= 2 (1/2 − 1/3) = 2 × 1/6 = 1/3.

Thus, E[X] = 1/3.

Vector Space Interpretation of Random Variables
The vector space interpretation of random variables provides a
geometric perspective on probability theory by treating random variables as
elements of a function space. This approach is fundamental in statistics,
signal processing, and machine learning.

Problem 12: For two random variables X and Y , the inner product is defined
as:

⟨ X , Y ⟩=E [ XY ] .

Show that if X and Y are uncorrelated, then their inner product equals
E [ X ] E [ Y ].
Solution to Problem 12:

For two random variables X and Y , the inner product is defined as:

⟨ X , Y ⟩=E [ XY ] .

Step 1: Use Covariance Formula

Cov ( X , Y ) =E [ XY ] −E [ X ] E [ Y ] .

Since X and Y are uncorrelated:

Cov ( X , Y ) =0.

E [ XY ] −E [ X ] E [ Y ] =0.

E [ XY ] =E [ X ] E [ Y ] .

Thus, their inner product equals E [ X ] E [ Y ].
