Abahy Math Assignment Reg - No.2023104317
Probability Distributions
Discrete and Continuous Distributions
Problem 1: Geometric Distribution->
P(X = k) = (1 − p)^(k−1) · p
For a fair die, p = 1/6:
P(X = 3) = (5/6)^2 × (1/6) = 25/216 ≈ 0.1157
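As a quick numeric check, the geometric probability above can be evaluated with a short Python sketch (the function name is illustrative):

```python
def geometric_pmf(k, p):
    """P(X = k) for a geometric random variable: first success on trial k."""
    return (1 - p) ** (k - 1) * p

p = 1 / 6  # probability of rolling a chosen face on a fair die
prob = geometric_pmf(3, p)
print(round(prob, 4))  # 25/216 ≈ 0.1157
```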
Uniform Distribution
Continuous Uniform Distribution:- A random variable X follows a continuous uniform distribution if it takes values in an interval [a, b] with constant density:
f_X(x) = 1/(b − a), a ≤ x ≤ b
For X ∼ U(2, 10):
f_X(x) = 1/(10 − 2) = 1/8
P(4 ≤ X ≤ 8) = (8 − 4)/8 = 0.5
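The interval probability can be checked with a small helper (a minimal sketch; the clipping to the support is an added safeguard):

```python
def uniform_prob(a, b, lo, hi):
    """P(lo <= X <= hi) for X ~ Uniform(a, b), clipping [lo, hi] to the support."""
    lo, hi = max(lo, a), min(hi, b)
    return max(hi - lo, 0) / (b - a)

print(uniform_prob(2, 10, 4, 8))  # 0.5
```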
Joint Probability->
Joint probability refers to the probability of two or more events occurring simultaneously. If A and B are two events, their joint probability is denoted as P(A ∩ B) or P(A, B).
Question - Two dice are rolled. Find the probability that the sum is at least 10.
Favourable outcomes: (4, 6), (5, 5), (5, 6), (6, 4), (6, 5), (6, 6)
P(X ≥ 10) = 6/36 = 1/6
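The favourable outcomes can be enumerated directly, which confirms the count of 6:

```python
from fractions import Fraction

# Enumerate all 36 equally likely outcomes of two dice.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
favourable = [(i, j) for (i, j) in outcomes if i + j >= 10]
prob = Fraction(len(favourable), len(outcomes))
print(sorted(favourable))  # [(4, 6), (5, 5), (5, 6), (6, 4), (6, 5), (6, 6)]
print(prob)                # 1/6
```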
Poisson Process->
A Poisson process is characterized by a constant rate λ (called the intensity rate), which represents the average number of events per unit time.
Question->
A factory receives 6 defective items per day on average. What is the probability of receiving exactly 4 defective items in a day?
P(X = k) = e^(−λ) λ^k / k!
For λ = 6, k = 4:
P(X = 4) = e^(−6) · 6^4 / 4! = 1296 e^(−6) / 24
P(X = 4) ≈ 0.1339
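The same value follows from evaluating the Poisson pmf numerically (a standard-library-only sketch):

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson random variable with rate lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

print(round(poisson_pmf(4, 6), 4))  # ≈ 0.1339
```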
Exponential Distribution->
The Exponential Distribution is a continuous probability distribution used
to model the time between independent events that occur at a constant
average rate. It is commonly associated with waiting times.
Question->
A machine breaks down every 5 hours on average. What is the probability it survives more than 8 hours without failure?
P(T > t) = e^(−λt)
For λ = 1/5:
P(T > 8) = e^(−8/5) = e^(−1.6) ≈ 0.2019
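The survival probability is a one-line computation, sketched here for verification:

```python
import math

def exp_survival(t, lam):
    """P(T > t) for an exponential random variable with rate lam."""
    return math.exp(-lam * t)

print(round(exp_survival(8, 1 / 5), 4))  # ≈ 0.2019
```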
Normal Approximation to Binomial->
The Normal Approximation to the Binomial approximates a Binomial Distribution with a Normal Distribution when the number of trials n is large, which greatly simplifies calculations.
Question->
A multiple-choice exam has 50 questions with 4 choices each. A student guesses randomly. What is the probability of scoring more than 20 correct answers?
μ = np = 50 × 0.25 = 12.5, σ = √(np(1 − p)) = √(50 × 0.25 × 0.75) ≈ 3.06
Z = (X − μ)/σ = (20 − 12.5)/3.06 ≈ 2.45
P(Z > 2.45) ≈ 1 − 0.9929 = 0.0071
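The standard normal tail can be evaluated with the error function; this sketch reproduces the steps above (no continuity correction is applied, matching the working shown):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

n, p = 50, 0.25
mu = n * p
sigma = math.sqrt(n * p * (1 - p))
z = (20 - mu) / sigma
print(round(z, 2), round(1 - phi(z), 4))
```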
Gamma Distribution->
The Gamma Distribution is a continuous probability distribution that
models the time until k independent events occur in a Poisson process. It
generalizes the Exponential Distribution, which models the time until the
first event occurs.
Question->
A system's failure time follows a Gamma distribution with shape parameter k = 3 and rate λ = 2. Find the expected failure time.
E[X] = k/λ = 3/2 = 1.5
Weibull Distribution->
The Weibull Distribution is a continuous probability distribution used to
model lifetimes of objects or systems, reliability analysis, and failure
rates. It is flexible and can represent different types of failure behaviors
based on its shape parameter.
Question->
A component's lifetime follows a Weibull distribution with shape parameter β = 2 and scale parameter η = 5. Find the probability it lasts more than 6 units of time.
P(T > t) = e^(−(t/η)^β), so P(T > 6) = e^(−(6/5)^2) = e^(−1.44) ≈ 0.2369
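Using the Weibull survival function P(T > t) = e^(−(t/η)^β), the requested probability can be computed directly (a minimal sketch):

```python
import math

def weibull_survival(t, beta, eta):
    """P(T > t) for a Weibull distribution with shape beta and scale eta."""
    return math.exp(-((t / eta) ** beta))

print(round(weibull_survival(6, 2, 5), 4))  # ≈ 0.2369
```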
E[X] = np
Var(X) = np(1 − p)

E[X | X > a] = (∫_a^∞ x f_X(x) dx) / P(X > a)
Since f_X(x) = 1/2 for 0 ≤ x ≤ 2:
E[X | X > 1] = (∫_1^2 x × (1/2) dx) / P(X > 1)
= ((1/2) · [x²/2]_1^2) / (1/2)
= (1/2)(2 − 0.5) / 0.5 = 1.5
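The conditional expectation can be checked by numerically integrating x·f(x) over (1, 2) and dividing by P(X > 1), mirroring the derivation (all names here are illustrative):

```python
def conditional_mean(f, b, c, n=100_000):
    """E[X | X > c] for a density f supported below b, by midpoint-rule
    integration of x*f(x) over (c, b) divided by P(X > c)."""
    h = (b - c) / n
    xs = [c + (i + 0.5) * h for i in range(n)]
    num = sum(x * f(x) for x in xs) * h  # integral of x f(x) over (c, b)
    den = sum(f(x) for x in xs) * h      # P(X > c)
    return num / den

f = lambda x: 0.5  # density of Uniform(0, 2)
print(round(conditional_mean(f, 2, 1), 4))  # 1.5
```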
M_X(t) = E[e^(tX)] = ∫_0^∞ e^(tx) λ e^(−λx) dx
= λ ∫_0^∞ e^(−(λ − t)x) dx
= λ × 1/(λ − t), for t < λ
M_X(t) = λ/(λ − t), t < λ
P(X = k) = e^(−λ) λ^k / k!, k = 0, 1, 2, …
E[X] = Σ_{k=0}^∞ k · e^(−λ) λ^k / k!
E[X] = λ, Var(X) = λ
P(X ≤ m) = 0.5
1 − e^(−λm) = 0.5
Solving for m:
m = ln 2 / λ
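This median formula is easy to verify: the exponential CDF evaluated at m = ln 2 / λ must equal exactly 0.5 (the rate below is an illustrative value, not from the assignment):

```python
import math

def exp_median(lam):
    """Median of an exponential distribution: solves 1 - exp(-lam*m) = 0.5."""
    return math.log(2) / lam

lam = 2.0  # illustrative rate
m = exp_median(lam)
print(round(1 - math.exp(-lam * m), 10))  # 0.5
```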
If X is an exponential random variable with mean 1/λ, find the distribution of Y = X².
Solution: The transformation method gives:
F_Y(y) = P(Y ≤ y) = P(X² ≤ y) = P(X ≤ √y)
F_Y(y) = 1 − e^(−λ√y)
f_Y(y) = (λ / (2√y)) e^(−λ√y)
μ = np = 1000 × 0.5 = 500, σ = √(np(1 − p)) = √250 ≈ 15.81
Z₁ = (480 − 500)/15.81 ≈ −1.26, Z₂ = (520 − 500)/15.81 ≈ 1.26
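The two z-scores and the resulting interval probability can be computed numerically (a sketch using the error function from the standard library):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

n, p = 1000, 0.5
mu = n * p
sigma = math.sqrt(n * p * (1 - p))
z1 = (480 - mu) / sigma
z2 = (520 - mu) / sigma
prob = phi(z2) - phi(z1)
print(round(z1, 2), round(z2, 2), round(prob, 4))
```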
E[X²] = ∫_a^b x² · 1/(b − a) dx
Solution:
Var(X) = (b³ − a³)/(3(b − a)) − ((a + b)/2)²
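This variance formula simplifies to the well-known (b − a)²/12; the sketch below checks both forms agree on a sample interval (the interval values are illustrative):

```python
def uniform_var(a, b):
    """Var(X) for X ~ Uniform(a, b) via E[X^2] - E[X]^2."""
    ex2 = (b**3 - a**3) / (3 * (b - a))  # E[X^2]
    ex = (a + b) / 2                     # E[X]
    return ex2 - ex**2

a, b = 2.0, 10.0
print(uniform_var(a, b), (b - a) ** 2 / 12)
```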
F_Z(z) = (1/√(2π)) ∫_{−∞}^z e^(−t²/2) dt
This integral does not have a closed-form solution and is computed using numerical methods.
M_X(t) = E[e^(tX)] = (1/√(2π)) ∫_{−∞}^∞ e^(tx) e^(−x²/2) dx
P(X = k) = (1 − p)^(k−1) p, k = 1, 2, 3, …
E[X] = Σ_{k=1}^∞ k (1 − p)^(k−1) p
E[X] = 1/p
Var(X) = (1 − p)/p²
f_X(x) = (β^α / Γ(α)) x^(α−1) e^(−βx), x > 0
Using the expectation formula,
E[X] = α/β
Var(X) = α/β²
f_X(x) = (k/λ)(x/λ)^(k−1) e^(−(x/λ)^k), x > 0
E[X] = λ Γ(1 + 1/k)
Solution:
M_X(t) = E[e^(tX)] = ∫_0^∞ e^(tx) λ e^(−λx) dx
M_X(t) = λ/(λ − t), t < λ
f_X(x) = 1/(b − a), a ≤ x ≤ b
E[X] = ∫_a^b x · 1/(b − a) dx
E[X] = (a + b)/2
Joint Distribution
The joint distribution of two or more random variables describes the
probability that these variables take on specific values simultaneously. It
provides a complete characterization of the dependence between the
variables.
f_{X,Y}(x, y) = 2 for 0 ≤ x ≤ 1, 0 ≤ y ≤ x, and 0 otherwise.
Find P(X ≤ 1/2, Y ≤ 1/4).
Solution to Problem 1:
Because the density lives on the triangle 0 ≤ y ≤ x, the region of integration must be split at x = 1/4:
P(X ≤ 1/2, Y ≤ 1/4) = ∫_0^{1/4} ∫_0^x 2 dy dx + ∫_{1/4}^{1/2} ∫_0^{1/4} 2 dy dx
= ∫_0^{1/4} 2x dx + ∫_{1/4}^{1/2} (1/2) dx
= 1/16 + 1/8 = 3/16.
Thus, the required probability is 3/16.
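A numerical integration over the actual support (the density equals 2 only where y ≤ x) confirms the value; the midpoint rule below is a sketch, not a general-purpose integrator:

```python
def joint_prob(x_max, y_max, n=2000):
    """P(X <= x_max, Y <= y_max) for f(x, y) = 2 on 0 <= y <= x <= 1,
    via midpoint-rule integration over the support."""
    hx = x_max / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * hx
        y_top = min(x, y_max)    # respect the support constraint y <= x
        total += 2 * y_top * hx  # inner integral of the constant density
    return total

print(round(joint_prob(0.5, 0.25), 4))  # 0.1875 = 3/16
```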
Solution to Problem 2:
Since the total probability must be 1, we sum over all possible values:
Σ_{x=1}^{3} 1/(x + 1) = 1/2 + 1/3 + 1/4 = 6/12 + 4/12 + 3/12 = 13/12.
Σ_{y=1}^{3} 1/(y + 1) = 1/2 + 1/3 + 1/4 = 13/12.
c × (13/12) × (13/12) = 1.
Solving for c:
c = 144/169.
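With the joint pmf assumed to be P(X = x, Y = y) = c / ((x + 1)(y + 1)) for x, y ∈ {1, 2, 3} (as the sums above suggest), exact rational arithmetic confirms the normalization:

```python
from fractions import Fraction

# Assumed pmf: P(X=x, Y=y) = c / ((x + 1) * (y + 1)), x, y in {1, 2, 3}.
c = Fraction(144, 169)
total = sum(c / ((x + 1) * (y + 1)) for x in range(1, 4) for y in range(1, 4))
print(total)  # 1
```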
Marginal and Conditional Distributions
The marginal distribution of a subset of random variables is obtained by
summing (in the discrete case) or integrating (in the continuous case) over
the remaining variables in a joint distribution. It represents the probability
distribution of one variable irrespective of the values of the others.
Solution to Problem:
f_X(x) = 6(1 − x) ∫_0^x dy = 6x(1 − x), 0 ≤ x ≤ 1.
f_Y(y) = ∫_y^1 6(1 − x) dx.
Evaluating:
f_Y(y) = 6 [x − x²/2]_y^1 = 6 [(1 − 1/2) − (y − y²/2)] = 6 [1/2 − y + y²/2].
Thus,
f_Y(y) = 3 − 6y + 3y², 0 ≤ y ≤ 1.
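As a sanity check, the marginal above equals 3(1 − y)² and integrates to 1 over [0, 1]; the sketch below verifies both numerically:

```python
def f_Y(y):
    """Marginal density derived above: 3 - 6y + 3y^2 on [0, 1]."""
    return 3 - 6 * y + 3 * y ** 2

# Midpoint-rule integration of f_Y over [0, 1].
n = 10_000
h = 1.0 / n
total = sum(f_Y((i + 0.5) * h) for i in range(n)) * h
print(round(total, 6))                 # ≈ 1.0
print(f_Y(0.5), 3 * (1 - 0.5) ** 2)    # same value twice
```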
P(X = x, Y = y) = (x + y)/10, x, y ∈ {1, 2, 3}
Solution to Problem:
f_{Y|X}(y|x) = f_{X,Y}(x, y) / f_X(x).
f_{Y|X}(y|x) = 6(1 − x) / (6x(1 − x)) = 1/x, 0 ≤ y ≤ x.
Solution to Problem:
ρ(X, Y) = 3/√(…)
Solution to Problem:
Cov(X, Y) = E[XY] − E[X]E[Y].
We compute E[X] = 1/2, E[Y] = 1/4, and E[XY] = 3/20.
Cov(X, Y) = 3/20 − (1/2 × 1/4)
= 3/20 − 1/8 = 1/40.
Thus, Cov(X, Y) = 1/40.
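Exact fraction arithmetic confirms the covariance from the given moments:

```python
from fractions import Fraction

E_X = Fraction(1, 2)
E_Y = Fraction(1, 4)
E_XY = Fraction(3, 20)

cov = E_XY - E_X * E_Y  # Cov(X, Y) = E[XY] - E[X]E[Y]
print(cov)  # 1/40
```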
Solution to Problem:
Rearranging:
P(−ln(X) ≤ y) = P(X ≥ e^(−y))
= 1 − P(X < e^(−y))
= 1 − F_X(e^(−y))
= 1 − e^(−y), y > 0.
f_Y(y) = d/dy (1 − e^(−y)) = e^(−y), y > 0.
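A quick Monte Carlo check of this transformation: if U ∼ Uniform(0, 1), then −ln(U) should behave like an Exp(1) variable, whose mean is 1 (the seed and sample size are illustrative):

```python
import math
import random

random.seed(0)

# Sample -ln(U) for uniform U and compare the empirical mean to 1.
samples = [-math.log(random.random()) for _ in range(200_000)]
mean = sum(samples) / len(samples)
print(round(mean, 2))
```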
Solution to Problem:
F_Y(y) = P(Y ≤ y) = P(X² ≤ y)
= P(X ≤ √y)
= F_X(√y)
Solving:
F_Y(y) = 1 − e^(−λ√y)
Differentiating:
f_Y(y) = d/dy (1 − e^(−λ√y))
= λ e^(−λ√y) · 1/(2√y)
f_Y(y) = (λ / (2√y)) e^(−λ√y)
P( (Σ_{i=1}^{100} X_i − 300) / (σ√100) ≤ 1.96 ), where σ is the standard deviation of each X_i.
Solution to Problem:
|X_n − X| = |n/(n + 1) − 1| = 1/(n + 1).
Solution to Problem:
Step 1: Standardization
Z = (Σ_{i=1}^{100} X_i − 300) / (σ√100) ∼ N(0, 1).
We need to compute:
P(Z ≤ 1.96).
P(Z ≤ 1.96) = 0.975.
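The standard normal value Φ(1.96) = 0.975 can be reproduced with the error function:

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(round(phi(1.96), 4))  # ≈ 0.975
```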
Random Vector
A random vector is a collection of multiple random variables grouped
together, often used to describe multivariate probability distributions. It
extends the concept of a single random variable to higher dimensions.
Let X and Y be independent uniform random variables on (0,1). Find the joint
probability density function (PDF) of (X,Y).
Solution to Problem 1:
f_{X,Y}(x, y) = f_X(x) f_Y(y).
Since both X and Y are uniform on (0, 1), we have:
f_{X,Y}(x, y) = 1 × 1 = 1 for 0 < x < 1, 0 < y < 1.
Otherwise, f_{X,Y}(x, y) = 0.
Let X have density f_X(x) = 2(1 − x) for 0 ≤ x ≤ 1. Find E[X].
Solution to Problem 2:
E[X] = ∫_{−∞}^∞ x f_X(x) dx
= ∫_0^1 x · 2(1 − x) dx
= 2 ∫_0^1 (x − x²) dx
= 2 [x²/2 − x³/3]_0^1
= 2 (1/2 − 1/3)
= 2 × 1/6 = 1/3.
Thus, E[X] = 1/3.
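The value 1/3 can be confirmed with exact rational arithmetic on the antiderivative evaluated at the endpoints:

```python
from fractions import Fraction

# E[X] = integral of x * 2(1 - x) over [0, 1], using the antiderivative
# 2(x^2/2 - x^3/3) evaluated at 1 (the value at 0 vanishes).
expectation = 2 * (Fraction(1, 2) - Fraction(1, 3))
print(expectation)  # 1/3
```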
Problem 12: For two random variables X and Y, the inner product is defined as:
⟨X, Y⟩ = E[XY].
Show that if X and Y are uncorrelated, then their inner product equals E[X]E[Y].
Solution to Problem 12:
By definition, ⟨X, Y⟩ = E[XY], and covariance satisfies:
Cov(X, Y) = E[XY] − E[X]E[Y].
If X and Y are uncorrelated, then Cov(X, Y) = 0, so:
E[XY] − E[X]E[Y] = 0
E[XY] = E[X]E[Y].
Hence ⟨X, Y⟩ = E[XY] = E[X]E[Y], as required.