
Dr. S. Sivamani    MA3391 - PROBABILITY AND STATISTICS


UNIT – 2:- TWO DIMENSIONAL RANDOM VARIABLES

1. Define Two Dimensional RV.

Let S be the sample space associated with a random experiment E. Let X = X(s) and Y = Y(s) be two functions, each
assigning a real number to every outcome s of S. Then (X, Y) is called a two-dimensional r.v.

2. Define Two Dimensional Discrete RV, Two Dimensional Continuous RV

If the possible values of (X,Y) are finite or countably infinite, (X,Y) is called a two dimensional discrete r.v.

If (X,Y) can assume all values in a specified region R of the xy-plane, (X,Y) is called a two-dimensional continuous r.v.

3. Define Probability function of (X,Y) (OR) PMF of (X,Y) (OR) Joint PMF.

If (X,Y) is a two-dimensional discrete r.v., the function $P(X = x_i, Y = y_j) = p_{ij}$ is called the joint PMF of (X,Y) if it satisfies

(i) $p_{ij} \ge 0$ for all $i, j$   (ii) $\sum_{j}\sum_{i} p_{ij} = 1$

4. Define Probability function of (X,Y) (OR) PDF of (X,Y) (OR) Joint PDF.

If (X,Y) is a two-dimensional continuous r.v. such that

$P\left(x - \tfrac{dx}{2} \le X \le x + \tfrac{dx}{2},\; y - \tfrac{dy}{2} \le Y \le y + \tfrac{dy}{2}\right) = f(x,y)\,dx\,dy,$

then f(x,y) is called the joint PDF of (X,Y), provided it satisfies (i) $f(x,y) \ge 0$  (ii) $\iint_{R} f(x,y)\,dx\,dy = 1$.

5. What is the Stochastic Independence of Random Variables?

The R.V.'s X and Y are said to be independent if

(i) Discrete case: $P(X = x, Y = y) = P(X = x)\,P(Y = y)$ for all x, y

(ii) Continuous case: $f(x, y) = g(x)\,h(y)$, where g(x) and h(y) are the marginal densities of X and Y.

6. Write the properties of the C.D.F. of a 2D RV.

1. $F(-\infty, y) = 0 = F(x, -\infty)$ and $F(\infty, \infty) = 1$

2. $P\{a < X < b, Y \le y\} = F(b, y) - F(a, y)$   3. $P\{X \le x, c < Y < d\} = F(x, d) - F(x, c)$   4. $P\{a < X < b, c < Y < d\} = F(b, d) - F(a, d) - F(b, c) + F(a, c)$

5. At points of continuity of f(x,y), $\dfrac{\partial^2 F}{\partial x\,\partial y} = f(x, y)$.

7. If $P(x, y) = \dfrac{x + y}{12}$, x = 1, 2 and y = 1, 2, find the marginal probability mass functions of X and Y.

The joint PMF is:

X \ Y    1       2
1        2/12    3/12
2        3/12    4/12

Marginal PMFs:

X        1       2
P(X)     5/12    7/12

Y        1       2
P(Y)     5/12    7/12
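As a quick cross-check (not part of the original answer), a short Python sketch can recompute these marginals by summing the joint PMF over the other variable; the names joint_pmf, p_x and p_y are purely illustrative.

```python
from fractions import Fraction

# Joint pmf P(x, y) = (x + y)/12 for x = 1, 2 and y = 1, 2
joint_pmf = {(x, y): Fraction(x + y, 12) for x in (1, 2) for y in (1, 2)}

# Marginal of X: sum over y; marginal of Y: sum over x
p_x = {x: sum(joint_pmf[(x, y)] for y in (1, 2)) for x in (1, 2)}
p_y = {y: sum(joint_pmf[(x, y)] for x in (1, 2)) for y in (1, 2)}

print(p_x)  # {1: Fraction(5, 12), 2: Fraction(7, 12)}
print(p_y)  # {1: Fraction(5, 12), 2: Fraction(7, 12)}
```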
8. The joint PMF of (X,Y) is $P(x, y) = k(2x + y)$, x = 1, 2 and y = 1, 2. Find the value of k.

X \ Y    1     2
1        3k    4k
2        5k    6k

Sum of all probabilities: 18k = 1, hence k = 1/18.
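A minimal SymPy sketch (illustrative only, not part of the standard answer) that recovers k by equating the total probability to 1:

```python
from sympy import symbols, Eq, solve

k = symbols('k', positive=True)
# Total probability: sum of k(2x + y) over x = 1, 2 and y = 1, 2 must be 1
total = sum(k * (2 * x + y) for x in (1, 2) for y in (1, 2))  # 18*k
print(solve(Eq(total, 1), k))  # [1/18]
```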

9. Find the value of k if f(x,y) = k(1 - x)(1 - y) in 0 < x, y < 1 and f(x,y) = 0 otherwise is to be a joint density function.

WKT $\iint_{R} f(x,y)\,dx\,dy = 1$, so $k\int_0^1\int_0^1 (1 - x)(1 - y)\,dx\,dy = 1 \Rightarrow k \cdot \tfrac12 \cdot \tfrac12 = 1 \Rightarrow k = 4.$
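The same normalisation can be verified with a symbolic double integral; this SymPy sketch simply mirrors the working above:

```python
from sympy import symbols, integrate, Eq, solve

x, y, k = symbols('x y k', positive=True)
# k times the integral of (1 - x)(1 - y) over the unit square must equal 1
total = k * integrate((1 - x) * (1 - y), (x, 0, 1), (y, 0, 1))  # k/4
print(solve(Eq(total, 1), k))  # [4]
```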

10. Let X and Y be continuous random variables with joint probability density function f(x,y) = x(x + y)/8, 0 < x < 2, -x < y < x, and f(x,y) = 0 elsewhere. Find f(y | x).

Marginal density function of X: $g(x) = \int_{-x}^{x} \dfrac{x(x + y)}{8}\,dy = \dfrac{x^3}{4}$

Conditional density: $f(y \mid x) = \dfrac{f(x, y)}{g(x)} = \dfrac{x + y}{2x^2}$
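A small SymPy sketch, assuming the joint pdf x(x + y)/8 stated above, that reproduces the marginal and conditional densities (illustrative only):

```python
from sympy import symbols, integrate, simplify

x = symbols('x', positive=True)
y = symbols('y')
f = x * (x + y) / 8                 # joint pdf on 0 < x < 2, -x < y < x

g = integrate(f, (y, -x, x))        # marginal density of X
print(simplify(g))                  # x**3/4

print(simplify(f / g))              # conditional density f(y|x): (x + y)/(2*x**2)
```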

11. If the joint PDF of (X,Y) is given by f(x,y) = 8xy, 0 < x < y < 1, find the marginal density function of X.
(1  e x )(1  e y ), x  0, y  0
12. If the joint distribution function of X and Y is F(x,y)=  , find the joint pdf.
0, elsewhere
13. If $f(x, y) = e^{-(x+y)}$, x > 0, y > 0, check whether X and Y are independent.
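One way to check independence here is to compute the marginal densities and compare their product with the joint pdf; the SymPy sketch below is only an illustrative outline of that check.

```python
from sympy import symbols, exp, integrate, oo, simplify

x, y = symbols('x y', positive=True)
f = exp(-(x + y))                   # joint pdf on x > 0, y > 0

f_x = integrate(f, (y, 0, oo))      # marginal of X: exp(-x)
f_y = integrate(f, (x, 0, oo))      # marginal of Y: exp(-y)

# X and Y are independent iff the joint pdf factors into the marginals
print(simplify(f - f_x * f_y) == 0) # True, so X and Y are independent
```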
14. If $f(x, y) = \dfrac{kx}{y}$ for x = 0, 6, 12 and y = 1, 3, 6, and f(x,y) = 0 otherwise, is the PMF, find k.

15. If f(x,y) = 1/4, 0 ≤ x, y ≤ 2, find P(X + Y < 1).
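Here P(X + Y < 1) is the constant density 1/4 integrated over the triangle x ≥ 0, y ≥ 0, x + y < 1; a SymPy sketch of that integral (illustrative only):

```python
from sympy import symbols, integrate, Rational

x, y = symbols('x y')
f = Rational(1, 4)                  # joint pdf on 0 <= x, y <= 2

# Integrate the density over the triangle 0 <= y < 1 - x, 0 <= x < 1
prob = integrate(f, (y, 0, 1 - x), (x, 0, 1))
print(prob)                         # 1/8
```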
16. If X and Y are independent R.V.'s with means 4, -2 and variances 9, 5 respectively, find Var(2X + Y - 5).
WKT $\mathrm{Var}(aX + bY) = a^2\,\mathrm{Var}(X) + b^2\,\mathrm{Var}(Y) + 2ab\,\mathrm{Cov}(X, Y)$. Since X and Y are independent, Cov(X,Y) = 0, and adding a constant does not change the variance.
Var(2X + Y - 5) = 4(9) + 1(5) = 41.
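A rough Monte Carlo check of this value; normal samples are assumed here purely to generate data, since the formula itself uses only the variances and the independence of X and Y.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(4, 3, n)             # Var(X) = 9, so standard deviation 3
y = rng.normal(-2, np.sqrt(5), n)   # Var(Y) = 5

print(np.var(2 * x + y - 5))        # approximately 4*9 + 5 = 41
```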
17. Define covariance.
Covariance:- Covariance measures the joint variability of two random variables. If X and Y are two R.V.'s, the covariance between them is defined as $\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y)$.

If X and Y are independent, $E(XY) = E(X)E(Y)$, so $\mathrm{Cov}(X, Y) = 0$.

18. Prove that $\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y)$.

Writing $\bar{X} = E(X)$ and $\bar{Y} = E(Y)$:

$\mathrm{Cov}(X, Y) = E[(X - \bar{X})(Y - \bar{Y})]$
$= E[XY - \bar{X}Y - X\bar{Y} + \bar{X}\bar{Y}]$
$= E[XY] - \bar{X}E[Y] - \bar{Y}E[X] + \bar{X}\bar{Y}$
$= E[XY] - \bar{X}\bar{Y} - \bar{Y}\bar{X} + \bar{X}\bar{Y}$
$= E[XY] - E(X)E(Y)$
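A numerical illustration of this identity (not part of the proof): on simulated data, Cov(X, Y) computed directly agrees with E[XY] - E[X]E[Y].

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 100_000)
y = 2 * x + rng.normal(0, 0.5, 100_000)   # y is correlated with x

cov_direct = np.cov(x, y, bias=True)[0, 1]              # Cov(X, Y)
cov_identity = np.mean(x * y) - np.mean(x) * np.mean(y)
print(cov_direct, cov_identity)                         # the two values match
```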
19. Prove that $\mathrm{Cov}(aX, bY) = ab\,\mathrm{Cov}(X, Y)$.

$\mathrm{Cov}(aX, bY) = E[(aX - a\bar{X})(bY - b\bar{Y})] = ab\,E[(X - \bar{X})(Y - \bar{Y})] = ab\,\mathrm{Cov}(X, Y)$

20. Define correlation and write its properties.

Correlation is the degree of relationship between two or more variables. Properties: (i) the correlation coefficient r lies between -1 and +1; (ii) r is independent of change of origin and scale; (iii) if X and Y are independent, then r = 0.

21. Define Regression, Regression Coefficients.

Regression is a mathematical measure of the average relationship between two or more variables in terms of the original units of the data.

Regression Coefficients:- $b_{yx} = r\,\dfrac{\sigma_y}{\sigma_x}$ and $b_{xy} = r\,\dfrac{\sigma_x}{\sigma_y}$

Lines of Regression:-

1. The line of regression of Y on X is $Y - \bar{Y} = b_{yx}(X - \bar{X})$

2. The line of regression of X on Y is $X - \bar{X} = b_{xy}(Y - \bar{Y})$

22. Properties of Regression Lines:-

1. The regression lines pass through $(\bar{X}, \bar{Y})$, which is their point of intersection.

2. The regression lines coincide when $r = \pm 1$.

3. When r = 0, the lines $X = \bar{X}$ and $Y = \bar{Y}$ are perpendicular to each other and parallel to the axes.

4. The slopes of the regression lines are $r\,\dfrac{\sigma_y}{\sigma_x}$ and $\dfrac{1}{r}\,\dfrac{\sigma_y}{\sigma_x}$.

5. The correlation coefficient is the geometric mean of the two regression coefficients: $r = \pm\sqrt{b_{xy}\,b_{yx}}$.

6. If $b_{xy} > 1$, then $b_{yx} < 1$ (since $b_{xy}\,b_{yx} = r^2 \le 1$).

23. Write the angle between the regression lines.

$\tan\theta = \dfrac{\sigma_x\,\sigma_y}{\sigma_x^2 + \sigma_y^2}\left(\dfrac{1 - r^2}{r}\right)$
24. The random variables X and Y are related by 4x - 5y + 33 = 0 and 20x - 9y = 107. Find the mean of X and the mean of Y.

WKT the regression lines pass through $(\bar{X}, \bar{Y})$, their point of intersection.

Solving the given equations we get $\bar{X} = 13,\ \bar{Y} = 17$.
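The same intersection can be found numerically; a small NumPy sketch solving the two regression equations (illustrative only):

```python
import numpy as np

# 4x - 5y = -33 and 20x - 9y = 107; the solution is (mean of X, mean of Y)
A = np.array([[4.0, -5.0],
              [20.0, -9.0]])
b = np.array([-33.0, 107.0])

print(np.linalg.solve(A, b))   # [13. 17.]
```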


25. State the Central Limit Theorem.

If $X_1, X_2, \ldots, X_n$ is a sequence of n independent and identically distributed random variables, each with mean $\mu$ and standard deviation $\sigma$, then the variable

$Z_n = \dfrac{X_1 + X_2 + \cdots + X_n - n\mu}{\sigma\sqrt{n}}$

has a distribution that approaches the standard normal distribution as n tends to infinity, provided the m.g.f. of the $X_i$ exists.
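A simulation sketch of the theorem (illustrative only; the exponential population is an arbitrary non-normal choice) showing that the standardised sample mean behaves like a standard normal variable for moderate n:

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 50, 100_000
mu, sigma = 1.0, 1.0                      # Exponential(1) has mean 1 and sd 1

# Standardised sample means for a non-normal (exponential) population
samples = rng.exponential(mu, size=(trials, n))
z = (samples.mean(axis=1) - mu) / (sigma / np.sqrt(n))

print(z.mean(), z.std())                  # roughly 0 and 1, as the CLT predicts
```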
