Supportive Notes & QB-Distribution Theory-PS-Unit2
Properties:
1. For real numbers a1, a2, b1, b2
P (a1 < X ≤ b1, a2 < Y ≤ b2) = F (b1, b2) + F (a1, a2) - F (a1, b2) - F (b1, a2)
2. 0 ≤ F (x, y) ≤ 1 ∀ (x, y)
3. F(x, y) is non-decreasing in each argument, i.e. if a1 < b1 and a2 < b2, then F(b1, a2) ≥ F(a1, a2) and F(a1, b2) ≥ F(a1, a2)
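Property 1 can be checked numerically. The sketch below uses a small hypothetical joint pmf (the dictionary `pmf` is an illustrative choice, not from the notes) and compares the rectangle probability with the four-term CDF expression:

```python
# Numeric check of property 1 for a joint CDF, using a hypothetical pmf.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.15, (2, 1): 0.25}

def F(x, y):
    """Joint CDF F(x, y) = P(X <= x, Y <= y)."""
    return sum(p for (i, j), p in pmf.items() if i <= x and j <= y)

def rect_prob(a1, b1, a2, b2):
    """P(a1 < X <= b1, a2 < Y <= b2) computed directly from the pmf."""
    return sum(p for (i, j), p in pmf.items() if a1 < i <= b1 and a2 < j <= b2)

# P(a1 < X <= b1, a2 < Y <= b2) = F(b1,b2) + F(a1,a2) - F(a1,b2) - F(b1,a2)
a1, b1, a2, b2 = 0, 2, 0, 1
lhs = rect_prob(a1, b1, a2, b2)
rhs = F(b1, b2) + F(a1, a2) - F(a1, b2) - F(b1, a2)
print(abs(lhs - rhs) < 1e-12)  # True
```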
The r-th central moment of X (about the mean µ = µ1′) is
µr = E(X − µ)^r = Σ_{x=0}^∞ (x − µ)^r p(x), if X is a discrete r.v. with pmf p(x)
               = ∫_{−∞}^{∞} (x − µ)^r f(x) dx, if X is a continuous r.v. with pdf f(x)
Expanding by the binomial theorem gives the relation between central and raw moments:
µr = E(X − µ)^r = E{ Σ_{k=0}^{r} (−1)^k C(r, k) (µ1′)^k X^(r−k) }
   = Σ_{k=0}^{r} (−1)^k C(r, k) (µ1′)^k µ′_(r−k)
In particular,
µ1 = 0
µ2 = µ2′ − (µ1′)^2
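The binomial relation above can be verified against a direct computation. A minimal sketch, using a hypothetical three-point pmf chosen only for illustration:

```python
from math import comb

# Hypothetical discrete pmf for illustration.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

def raw_moment(r):
    """mu'_r = E(X^r)."""
    return sum(x**r * p for x, p in pmf.items())

def central_moment_direct(r):
    """mu_r = E(X - mu)^r computed directly from the pmf."""
    mu = raw_moment(1)
    return sum((x - mu)**r * p for x, p in pmf.items())

def central_moment_binomial(r):
    """mu_r via mu_r = sum_k (-1)^k C(r,k) (mu'_1)^k mu'_{r-k}."""
    m1 = raw_moment(1)
    return sum((-1)**k * comb(r, k) * m1**k * raw_moment(r - k)
               for k in range(r + 1))

for r in range(1, 5):
    assert abs(central_moment_direct(r) - central_moment_binomial(r)) < 1e-12
print(abs(central_moment_binomial(2) - (raw_moment(2) - raw_moment(1)**2)) < 1e-12)  # True
```

The loop confirms the relation for r = 1 to 4; the last line checks the special case µ2 = µ2′ − (µ1′)².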
Definition: The moment generating function (mgf) of a random variable X (about the origin) is denoted by MX(t) and is defined as MX(t) = E(e^(tX)), provided the RHS exists for values of t in some interval −h < t < h, h ∈ R. Thus
MX(t) = E(e^(tX)) = Σ_{x=0}^∞ e^(tx) p(x), if X is a discrete r.v. with pmf p(x)
                 = ∫_{−∞}^{∞} e^(tx) f(x) dx, if X is a continuous r.v. with pdf f(x)
iii) Effect of change of origin and scale: if Y = (X − a)/h, then MY(t) = e^(−at/h) MX(t/h)
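The change-of-origin-and-scale property can be checked numerically for a finite discrete r.v. A minimal sketch, where the pmf and the constants a, h are hypothetical choices for illustration:

```python
from math import exp

# Hypothetical discrete pmf for illustration.
pmf = {0: 0.3, 1: 0.4, 2: 0.3}

def mgf(pmf, t):
    """M(t) = E(e^{tX}) = sum_x e^{tx} p(x)."""
    return sum(exp(t * x) * p for x, p in pmf.items())

# Y = (X - a)/h  =>  M_Y(t) = e^{-at/h} M_X(t/h)
a, h = 1.0, 2.0
pmf_Y = {(x - a) / h: p for x, p in pmf.items()}  # pmf of the transformed r.v.

t = 0.7
lhs = mgf(pmf_Y, t)
rhs = exp(-a * t / h) * mgf(pmf, t / h)
print(abs(lhs - rhs) < 1e-12)  # True
```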
iv) Uniqueness theorem:
The moment generating function of a distribution, if it exists, uniquely determines the distribution. In other words, corresponding to a given probability distribution there is only one mgf (provided it exists), and corresponding to a given mgf there is only one probability distribution.
Joint Moment Generating Function.
The joint moment generating function of a bivariate distribution, denoted by M(t1, t2), is defined as
M(t1, t2) = MX,Y(t1, t2) = E[e^(t1 X + t2 Y)]
          = Σ_x Σ_y e^(t1 x + t2 y) p(x, y), if X and Y are discrete random variables with joint pmf p(x, y)
          = ∫∫ e^(t1 x + t2 y) f(x, y) dx dy, if X and Y are continuous random variables with joint pdf f(x, y)
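One useful consequence of this definition: for independent X and Y the joint mgf factorizes as MX,Y(t1, t2) = MX(t1) MY(t2). A sketch checking this on hypothetical two-point marginals:

```python
from math import exp

# Hypothetical marginal pmfs; independence gives joint pmf p(x, y) = p(x) q(y).
px = {0: 0.5, 1: 0.5}
py = {0: 0.25, 1: 0.75}
joint = {(x, y): px[x] * py[y] for x in px for y in py}

def joint_mgf(t1, t2):
    """M(t1, t2) = sum_x sum_y e^{t1 x + t2 y} p(x, y)."""
    return sum(exp(t1 * x + t2 * y) * p for (x, y), p in joint.items())

def mgf(pmf, t):
    """Marginal mgf M(t) = sum_x e^{tx} p(x)."""
    return sum(exp(t * x) * p for x, p in pmf.items())

t1, t2 = 0.3, -0.2
print(abs(joint_mgf(t1, t2) - mgf(px, t1) * mgf(py, t2)) < 1e-12)  # True
```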
Transformations:
One Dimensional transformation.
Let X be a continuous random variable with pdf f(x).
Consider a transformation Y = g(X). Let y = g(x) be a one-to-one transformation between the values of x and y.
Then the pdf of Y is given by fY(y) = f[w(y)] |J|, where y = g(x) ⇒ x = w(y) and
J = ∂x/∂y is called the Jacobian of the transformation.
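As a concrete check, take X ~ Uniform(0, 1) and Y = g(X) = −ln X (a hypothetical example; the notes do not fix a particular g). The change-of-variable formula should recover the Exp(1) density e^(−y):

```python
from math import exp

# X ~ Uniform(0, 1); transformation Y = g(X) = -ln(X), one-to-one on (0, 1).
def f_X(x):
    """pdf of Uniform(0, 1)."""
    return 1.0 if 0 < x < 1 else 0.0

# y = -ln(x)  =>  x = w(y) = e^{-y},  J = dx/dy = -e^{-y},  |J| = e^{-y}
def f_Y(y):
    """pdf of Y via f_Y(y) = f_X[w(y)] |J|."""
    x = exp(-y)
    return f_X(x) * abs(-exp(-y))

# The result should be the Exp(1) density e^{-y} for y > 0.
for y in (0.1, 1.0, 2.5, 5.0):
    assert abs(f_Y(y) - exp(-y)) < 1e-12
print("ok")
```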
Two- Dimensional Transformation.
Let X and Y be two continuous random variables with joint pdf f(x, y).
Consider a transformation U = g1(X, Y) and V = g2(X, Y). Let u = g1(x, y) and v = g2(x, y) be a one-to-one transformation between the pairs of values (x, y) and (u, v), so that the equations u = g1(x, y) and v = g2(x, y) give a unique solution for x and y in terms of u and v. Let x = w1(u, v) and y = w2(u, v). Then the joint pdf of U and V is given by
f(u, v) = f{w1(u, v), w2(u, v)} |J|, where
J = | ∂x/∂u  ∂x/∂v |
    | ∂y/∂u  ∂y/∂v |
is called the Jacobian of the transformation.
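A standard worked example (hypothetical here, chosen for illustration): X, Y independent Exp(1) with U = X + Y, V = X/(X + Y). The inverse is x = uv, y = u(1 − v), with Jacobian determinant −u, so |J| = u and f(u, v) = u e^(−u), the product of a Gamma(2) and a Uniform(0, 1) density:

```python
from math import exp

# X, Y independent Exp(1); transformation U = X + Y, V = X/(X + Y).
def f_XY(x, y):
    """Joint pdf of two independent Exp(1) variables."""
    return exp(-(x + y)) if x > 0 and y > 0 else 0.0

# Inverse: x = w1(u, v) = u*v,  y = w2(u, v) = u*(1 - v)
# J = det [[dx/du, dx/dv], [dy/du, dy/dv]] = det [[v, u], [1-v, -u]] = -u
def f_UV(u, v):
    """Joint pdf of (U, V) via f(u,v) = f_XY(w1, w2) |J|."""
    return f_XY(u * v, u * (1 - v)) * abs(-u)

# Known result: f_UV(u, v) = u e^{-u} on u > 0, 0 < v < 1.
for u in (0.5, 1.0, 3.0):
    for v in (0.2, 0.7):
        assert abs(f_UV(u, v) - u * exp(-u)) < 1e-12
print("ok")
```

Since the density does not depend on v, U and V come out independent, which is a useful sanity check on the Jacobian computation.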
Convolutions:
Let X and Y be non-negative independent integral-valued random variables with probabilities P(X = j) = aj and P(Y = j) = bj. The event (X = j, Y = k) has probability aj bk. The sum Z = X + Y is a new random variable, and the event Z = r is the union of the mutually exclusive events
(X = 0, Y = r), (X = 1, Y = r−1), …, (X = r, Y = 0). Therefore the distribution cr = P(Z = r) is given by
cr = a0 br + a1 br−1 + a2 br−2 + … + ar−1 b1 + ar b0    (1)
The operation (1), leading from two sequences {ak} and {bk} to a new sequence {ck}
is called the convolution of {ak} and {bk}.
Definition: Let {ak} and {bk} be any two numerical sequences (not necessarily probability distributions). The new sequence {ck} defined by
ck = a0 bk + a1 bk−1 + a2 bk−2 + … + ak−1 b1 + ak b0
is called the convolution of {ak} and {bk} and is denoted by {ck} = {ak} * {bk}.
Examples: i) If ak = 1, bk =1 for all k ≥ 0, then ck= k +1.
ii) If ak = k, bk =1 for all k ≥ 0, then ck= k(k +1)/2
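Both examples can be reproduced with a short convolution routine (truncating to the first few terms of each infinite sequence):

```python
def convolve(a, b):
    """{c_k} = {a_k} * {b_k}:  c_k = sum_{j=0}^{k} a_j b_{k-j}."""
    n = min(len(a), len(b))
    return [sum(a[j] * b[k - j] for j in range(k + 1)) for k in range(n)]

# Example i): a_k = 1, b_k = 1  =>  c_k = k + 1
ones = [1] * 6
print(convolve(ones, ones))  # [1, 2, 3, 4, 5, 6]

# Example ii): a_k = k, b_k = 1  =>  c_k = k(k + 1)/2
ints = list(range(6))
print(convolve(ints, ones))  # [0, 1, 3, 6, 10, 15]
```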
Now let A(s) = Σ_{i=0}^∞ ai s^i and B(s) = Σ_{j=0}^∞ bj s^j be the generating functions of {ak} and {bk}. Then
A(s)B(s) = Σ_i ai s^i · Σ_j bj s^j = Σ_r (a0 br + a1 br−1 + … + ar b0) s^r = Σ_r cr s^r,
i.e. the generating function of the convolution {ck} = {ak} * {bk} is the product A(s)B(s).
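For finite sequences, generating functions are just polynomials, so the identity above says that polynomial multiplication and convolution produce the same coefficients. A sketch with two hypothetical finite distributions:

```python
def poly_mul(a, b):
    """Coefficients of the product A(s)B(s) of two polynomials."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def convolve(a, b):
    """c_r = a_0 b_r + a_1 b_{r-1} + ... + a_r b_0 (sequences padded with zeros)."""
    n = len(a) + len(b) - 1
    ap = a + [0.0] * (n - len(a))
    bp = b + [0.0] * (n - len(b))
    return [sum(ap[j] * bp[r - j] for j in range(r + 1)) for r in range(n)]

# Two hypothetical finite distributions.
a = [0.2, 0.5, 0.3]   # A(s) = 0.2 + 0.5 s + 0.3 s^2
b = [0.6, 0.4]        # B(s) = 0.6 + 0.4 s
print(all(abs(p - c) < 1e-12
          for p, c in zip(poly_mul(a, b), convolve(a, b))))  # True
```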
……………………………………………………………………………………..
Question Bank- Unit 2
4) Let X ~ P(m) and Y|X = x ~ Binomial(x, p). Find the probability mass function of Y.
8) Prove theorem: If X and Y are any two random variables, then
i) E(Y) = E{E(Y/X)}
ii) V(Y) = E{V(Y/X)} + V{E(Y/X)}
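The two identities in 8) can be verified exactly on a small joint distribution (the pmf below is a hypothetical example, not part of the question):

```python
# Exact check of E(Y) = E{E(Y|X)} and V(Y) = E{V(Y|X)} + V{E(Y|X)}.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.25, (1, 2): 0.45}

# Marginal pmf of X.
px = {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p

def cond_moments(x):
    """E(Y|X=x) and V(Y|X=x) from the conditional pmf p(y|x)."""
    ey = sum(y * p / px[x] for (xi, y), p in joint.items() if xi == x)
    ey2 = sum(y * y * p / px[x] for (xi, y), p in joint.items() if xi == x)
    return ey, ey2 - ey * ey

EY = sum(y * p for (_, y), p in joint.items())
EY2 = sum(y * y * p for (_, y), p in joint.items())
VY = EY2 - EY * EY

E_cond = sum(px[x] * cond_moments(x)[0] for x in px)           # E{E(Y|X)}
EV = sum(px[x] * cond_moments(x)[1] for x in px)               # E{V(Y|X)}
VE = sum(px[x] * (cond_moments(x)[0] - EY) ** 2 for x in px)   # V{E(Y|X)}

print(abs(EY - E_cond) < 1e-12, abs(VY - (EV + VE)) < 1e-12)  # True True
```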
9) Let f(x, y) = e^(−y), 0 < x < y < ∞, be the joint pdf of X and Y.
Obtain the joint mgf of X and Y. Find the correlation coefficient ρ(X, Y).
11) Let f(x, y, z) = e^(−(x+y+z)), 0 < x < ∞, 0 < y < ∞, 0 < z < ∞,
be the joint pdf of X, Y and Z. Obtain the pdf of U = (X + Y + Z)/3.