Modul StatMat I 2020

This document outlines the assessment criteria and topics for the Mathematical Statistics I course taught by Sudarno in the Statistics Department at UNDIP. Student attendance is required for a minimum of 6 out of the 7 class meetings. Assessment is based on individual assignments (15%), a midterm exam (35%), and a final exam (50%). The midterm exam covers chapters 1 through 6. The first meeting covers distributions of two random variables, including the joint cumulative distribution function and probability mass/density functions for discrete and continuous random vectors.


Mathematical Statistics I by Sudarno Statistics Department UNDIP

COURSE CONTRACT

Grading:
- Necessary condition: attendance at a minimum of 6 of the 7 class meetings
- Grade components:
  - Individual assignments: 15%
  - Midterm examination: 35%
  - Final examination: 50%

- Midterm examination material: Chapter 1 through Chapter 6


MULTIVARIATE DISTRIBUTIONS

Meeting 1
CHAPTER I
DISTRIBUTIONS OF TWO RANDOM VARIABLES

Learning Objectives
After studying this chapter, students are expected to be able to:
1. Find the marginal probability mass function of a discrete random variable.
2. Find the probability density function of a continuous random variable.
3. Compute various probabilities as volumes under a surface.

We begin with a discussion of a pair of random variables (X1, X2) defined on the sample space C and taking values in a two-dimensional set D, a subset of two-dimensional Euclidean space R^2. Hence (X1, X2) is a vector function from C to D. We now formulate the definition of a
random vector.

Definition 1 (Random Vector).


Given a random experiment with a sample space C, consider two random variables X1 and X2, which assign to each element c of C one and only one ordered pair of numbers X1(c) = x1, X2(c) = x2. Then we say that (X1, X2) is a random vector. The space of (X1, X2) is the set of ordered pairs D = {(x1, x2) : x1 = X1(c), x2 = X2(c), c in C}.

We often denote random vectors using the column-vector notation X = (X1, X2)', where the prime (') denotes the transpose of the row vector (X1, X2). Let D be the space associated with the random vector (X1, X2), and let the event A be a subset of D. The probability of the event A is denoted by $P[(X_1,X_2) \in A]$. The joint cumulative distribution function (jcdf) of (X1, X2) is given by

$F_{X_1,X_2}(x_1,x_2) = P[\{X_1 \le x_1\} \cap \{X_2 \le x_2\}]$, for all $(x_1,x_2) \in R^2$.

Discrete type


A random vector (X1, X2) is a discrete random vector if its space D is finite or countable. Hence, X1 and X2 are both discrete as well. The joint probability mass function (jpmf) of (X1, X2) is defined by

$p_{X_1,X_2}(x_1,x_2) = P[X_1 = x_1, X_2 = x_2]$, for all $(x_1,x_2) \in D$.

As with random variables, the pmf uniquely determines the cdf. It is also characterized by the two properties:

i. $0 \le p_{X_1,X_2}(x_1,x_2) \le 1$;

ii. $\sum\sum_{D} p_{X_1,X_2}(x_1,x_2) = 1$.

For an event $B \subset D$, we have

$P[(X_1,X_2) \in B] = \sum\sum_{B} p_{X_1,X_2}(x_1,x_2)$.
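As a computational illustration (not part of the original module), the Python sketch below stores a joint pmf as a dictionary, checks property (ii), and computes the probability of an event by summing over its points. The probabilities used are assumed illustrative values, not those of Example 1.

```python
# A joint pmf stored as a dictionary: key = (x1, x2), value = p(x1, x2).
# Illustrative probabilities only.
joint_pmf = {
    (0, 0): 1/12, (0, 1): 2/12,
    (1, 1): 3/12, (1, 2): 2/12,
    (2, 2): 3/12, (2, 3): 1/12,
}

# Property (ii): the probabilities over the support must sum to 1.
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-12

# P[(X1, X2) in B] for the event B = {X1 + X2 <= 2}.
event_prob = sum(p for (x1, x2), p in joint_pmf.items() if x1 + x2 <= 2)
print(event_prob)  # 0.5
```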

Example 1.
Consider the discrete random vector (X1, X2) whose pmf is given by the following table, where the nonzero entries p(x1, x2) occupy the cells of the support and all other cells are zero:

                           Support of X2
                      0        1        2        3
Support of X1   0   p(0,0)   p(0,1)     0        0
                1     0      p(1,1)   p(1,2)     0
                2     0        0      p(2,2)   p(2,3)

At times it is convenient to speak of the support S of a discrete random vector (X1, X2): the set of all points (x1, x2) in the space of (X1, X2) such that p(x1, x2) > 0. Here the support is S = {(0,0), (0,1), (1,1), (1,2), (2,2), (2,3)}.

Continuous type
A random vector (X1, X2) with space D is of the continuous type if its cdf is continuous and can be written as

$F_{X_1,X_2}(x_1,x_2) = \int_{-\infty}^{x_1}\int_{-\infty}^{x_2} f_{X_1,X_2}(w_1,w_2)\,dw_2\,dw_1$, for all $(x_1,x_2) \in R^2$.

We call the integrand $f_{X_1,X_2}$ the joint probability density function (jpdf) of (X1, X2). A pdf is essentially characterized by the two properties:

i. $f_{X_1,X_2}(x_1,x_2) \ge 0$ for all $(x_1,x_2)$;

ii. $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X_1,X_2}(x_1,x_2)\,dx_1\,dx_2 = 1$.

For an event $A \subset D$, we have

$P[(X_1,X_2) \in A] = \int\!\!\int_A f_{X_1,X_2}(x_1,x_2)\,dx_1\,dx_2$.

Note that $P[(X_1,X_2) \in A]$ is just the volume under the surface $z = f_{X_1,X_2}(x_1,x_2)$ over the set A.

Remark:
We often drop the subscript (X1,X2) from joint cdfs, pdfs, and pmfs when it is clear from the context. We also use notation such as $f_{12}$ instead of $f_{X_1,X_2}$. Besides (X1, X2), we often use (X, Y) to denote random vectors.

Example 2.
Let

be the pdf of two random variables X1 and X2 of the continuous type. We have

Note that this probability is the volume under the surface above the

rectangular set .

For a continuous random vector (X1, X2), the support S of (X1, X2) contains all points $(x_1,x_2)$ for which $f_{X_1,X_2}(x_1,x_2) > 0$. The support S is a subset of the space D.
If a pmf or a pdf in one or more variables is explicitly defined, we can see by inspection
whether the random variables are of the continuous or discrete type. For example, it seems
obvious that


is a pmf of two discrete-type random variables X and Y, whereas

is clearly a pdf of two-continuous-type random variables X and Y.


Let (X1, X2) be a random vector. Then both X1 and X2 are random variables. We can obtain their distributions in terms of the joint distribution of (X1, X2) as follows. The event which defines the cdf of X1 at x1 is $\{X_1 \le x_1\} \cap \{-\infty < X_2 < \infty\}$. Taking probabilities, we have

$F_{X_1}(x_1) = P[X_1 \le x_1, -\infty < X_2 < \infty]$, for all $x_1 \in R$.

First consider the discrete case. Let $D_{X_1}$ be the support of X1. For $x_1 \in D_{X_1}$,

$F_{X_1}(x_1) = \sum_{w_1 \le x_1} \left\{ \sum_{x_2} p_{X_1,X_2}(w_1, x_2) \right\}$.

By the uniqueness of cdfs, the quantity in braces must be the pmf of X1 evaluated at w1; that is,

$p_{X_1}(x_1) = \sum_{x_2} p_{X_1,X_2}(x_1, x_2)$, for all $x_1 \in D_{X_1}$.

Hence, in the discrete case, the marginal pmf of X1 is found by summing out x2; similarly,
the marginal pmf of X2 is found by summing out x1.
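The "summing out" operation is easy to mirror in code. The following Python sketch (using the same assumed illustrative probabilities as in the earlier sketch, not the module's table) computes both marginal pmfs from a joint pmf.

```python
from collections import defaultdict

# Illustrative joint pmf; key = (x1, x2), value = p(x1, x2).
joint_pmf = {
    (0, 0): 1/12, (0, 1): 2/12,
    (1, 1): 3/12, (1, 2): 2/12,
    (2, 2): 3/12, (2, 3): 1/12,
}

p_x1 = defaultdict(float)
p_x2 = defaultdict(float)
for (x1, x2), p in joint_pmf.items():
    p_x1[x1] += p   # marginal pmf of X1: sum out x2
    p_x2[x2] += p   # marginal pmf of X2: sum out x1

print(dict(p_x1))  # {0: 0.25, 1: 0.4166..., 2: 0.3333...}
print(dict(p_x2))  # {0: 0.0833..., 1: 0.4166..., 2: 0.4166..., 3: 0.0833...}
```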


Example 3.
Consider again a discrete random vector (X1, X2) whose pmf is given by a table of the form used in Example 1:

                           Support of X2
                      0        1        2        3
Support of X1   0   p(0,0)   p(0,1)     0        0
                1     0      p(1,1)   p(1,2)     0
                2     0        0      p(2,2)   p(2,3)

The marginal pmf of X1, $p_{X_1}(x_1)$, is obtained by summing the entries across each row, and the marginal pmf of X2, $p_{X_2}(x_2)$, is obtained by summing the entries down each column.

We next consider the continuous case. Let $D_{X_1}$ be the support of X1. For $x_1 \in D_{X_1}$,

$F_{X_1}(x_1) = \int_{-\infty}^{x_1} \left\{ \int_{-\infty}^{\infty} f_{X_1,X_2}(w_1, x_2)\, dx_2 \right\} dw_1$.

By the uniqueness of cdfs, the quantity in braces must be the pdf of X1 evaluated at w1; that is,

$f_{X_1}(x_1) = \int_{-\infty}^{\infty} f_{X_1,X_2}(x_1, x_2)\, dx_2$, for all $x_1 \in D_{X_1}$.

Hence, in the continuous case, the marginal pdf of X1 is found by integrating out x2.
Similarly, the marginal pdf of X2 is found by integrating out x1.
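As a symbolic illustration, the Python/SymPy sketch below integrates out one variable of an assumed joint pdf f(x1, x2) = x1 + x2 on the unit square (a standard textbook density, not necessarily the one used in Example 4) to obtain the marginal pdfs.

```python
import sympy as sp

x1, x2 = sp.symbols("x1 x2", positive=True)
f = x1 + x2                              # assumed joint pdf on 0 < x1 < 1, 0 < x2 < 1

f1 = sp.integrate(f, (x2, 0, 1))         # marginal pdf of X1: x1 + 1/2
f2 = sp.integrate(f, (x1, 0, 1))         # marginal pdf of X2: x2 + 1/2
total = sp.integrate(f1, (x1, 0, 1))     # sanity check: integrates to 1

print(f1, f2, total)
```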

Example 4.
Let X1 and X2 have the joint pdf

The marginal pdf of X1 is

zero elsewhere,

and the marginal pdf of X2 is

zero elsewhere.


A probability that involves X1 alone can be computed from either the marginal cdf $F_{X_1}(x_1)$ or the marginal pdf $f_{X_1}(x_1)$, because both are determined by the joint distribution. However, to find a probability that involves X1 and X2 jointly, we must use the joint pdf $f_{X_1,X_2}(x_1,x_2)$ and integrate it over the corresponding region of the plane. This latter probability is the volume under the surface $z = f_{X_1,X_2}(x_1,x_2)$ above that region.

Meeting 2
CHAPTER 2
EXPECTATION OF RANDOM VARIABLES

Learning Objectives
After studying this material, students are expected to be able to:
1. Find the expectation of both discrete and continuous random variables.
2. Use the algebraic properties of the expectation of random variables.
3. Construct the moment generating function of two random variables.

We now discuss the expectation of discrete and continuous random variables. Let (X1, X2) be a random vector and let Y = g(X1, X2) for some real-valued function; i.e., $g: R^2 \to R$. Then Y is a random variable, and we could determine its expectation by obtaining the distribution of Y. Instead, the expectation can be computed directly from the joint distribution. Suppose (X1, X2) is of the continuous type. Then

$E(Y) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x_1,x_2)\, f_{X_1,X_2}(x_1,x_2)\, dx_1\, dx_2$,

provided the integral exists. Likewise, if (X1, X2) is discrete, then

$E(Y) = \sum_{x_1}\sum_{x_2} g(x_1,x_2)\, p_{X_1,X_2}(x_1,x_2)$.
We can now show that E is a linear operator.


Theorem 1.
Let (X1, X2) be a random vector. Let Y1 = g1(X1, X2) and Y2 = g2(X1, X2) be random variables whose expectations exist. Then for all real numbers k1 and k2,

$E(k_1 Y_1 + k_2 Y_2) = k_1 E(Y_1) + k_2 E(Y_2)$.
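A quick numerical check of Theorem 1 (not part of the module) can be done with a small discrete joint pmf; the probabilities and the functions g1, g2 below are assumed illustrative choices.

```python
# Illustrative joint pmf; key = (x1, x2), value = p(x1, x2).
joint_pmf = {
    (0, 0): 1/12, (0, 1): 2/12,
    (1, 1): 3/12, (1, 2): 2/12,
    (2, 2): 3/12, (2, 3): 1/12,
}

def E(g):
    """E[g(X1, X2)] = sum over the support of g(x1, x2) * p(x1, x2)."""
    return sum(g(x1, x2) * p for (x1, x2), p in joint_pmf.items())

g1 = lambda x1, x2: x1 * x2
g2 = lambda x1, x2: x1 + x2
k1, k2 = 4.0, -2.0

lhs = E(lambda x1, x2: k1 * g1(x1, x2) + k2 * g2(x1, x2))
rhs = k1 * E(g1) + k2 * E(g2)
assert abs(lhs - rhs) < 1e-12   # E is a linear operator
```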

We also note that the expected value of any function g(X2) of X2 can be found in two ways:

$E[g(X_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x_2)\, f_{X_1,X_2}(x_1,x_2)\, dx_1\, dx_2 = \int_{-\infty}^{\infty} g(x_2)\, f_{X_2}(x_2)\, dx_2$,

the latter single integral being obtained from the double integral by integrating on x1 first.
The following example illustrates these ideas.

Example 5.
Let X1 and X2 have the joint pdf

Then

.
In addition,

Since X2 has the pdf , zero elsewhere, the latter expectation can be
found by

. Thus

Example 6.
Continuing with Example 5, suppose the random variable Y is defined by Y = X1/X2. We
determine E(Y) in two ways. The first way is by definition; i.e., find the distribution of Y
and then determine its expectation. The cdf of Y, for , is


Hence, the pdf of Y is


.
which leads to

The second way is to find E(Y) directly:

We next define the moment generating function of a random vector.


Definition 2 (Moment Generating Function of a Random Vector).
Let X = (X1, X2)' be a random vector. If $E[e^{t_1 X_1 + t_2 X_2}]$ exists for $|t_1| < h_1$ and $|t_2| < h_2$, where h1 and h2 are positive, it is denoted by $M_{X_1,X_2}(t_1,t_2)$ and is called the moment generating function (mgf) of X.

If it exists, the mgf of a random vector uniquely determines the distribution of the random vector. Letting t = (t1, t2)', we can write the mgf of X as

$M_{X_1,X_2}(t_1,t_2) = E[e^{t'X}]$.

Also, the mgfs of X1 and X2 are $M_{X_1,X_2}(t_1,0)$ and $M_{X_1,X_2}(0,t_2)$, respectively. If there is
no confusion, we often drop the subscripts on M.
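As a symbolic illustration of Definition 2 (not part of the module), the SymPy sketch below computes the joint mgf of an assumed pdf f(x1, x2) = exp(-x1 - x2), x1, x2 > 0 (two independent standard exponentials; this is not the pdf of Example 7), and recovers the marginal mgfs by setting one argument to zero.

```python
import sympy as sp

x1, x2 = sp.symbols("x1 x2", positive=True)
# Negative t's keep the symbolic integral convergent without case analysis;
# the mgf of this pdf actually exists for t1 < 1 and t2 < 1.
t1, t2 = sp.symbols("t1 t2", negative=True)

f = sp.exp(-x1 - x2)                          # assumed joint pdf on x1 > 0, x2 > 0
M = sp.simplify(sp.integrate(sp.exp(t1 * x1 + t2 * x2) * f,
                             (x1, 0, sp.oo), (x2, 0, sp.oo)))
# M = 1/((1 - t1)*(1 - t2)); marginal mgfs follow by zeroing one argument.
M_x1 = M.subs(t2, 0)   # 1/(1 - t1)
M_x2 = M.subs(t1, 0)   # 1/(1 - t2)
print(M, M_x1, M_x2)
```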

Example 7.
Let the continuous-type random variables X and Y have the joint pdf

The mgf of this joint distribution is

provided that and . Furthermore, the moment-generating functions of the


marginal distributions of X and Y are, respectively,


and

These moment generating functions are, of course, respectively, those of the marginal probability density functions,
zero elsewhere, and

zero elsewhere.

EXERCISES I
1. If , zero elsewhere, is the pdf of X1 and X2.

Find .
2. Let the random variables X1 and X2 have the joint pmf described as follows:
(0,0) (0,1) (0,2) (1,0) (1,1) (1,2)

and is equal to zero elsewhere.


(a) Write these probabilities in a rectangular array, recording each marginal pmf in
the “margins”.
(b) What is

3. Let X1 and X2 have the joint pdf , , zero

elsewhere. Find the marginal pdfs and compute .


4. Let X1 and X2 have the joint pdf , , zero elsewhere.
Find the marginal pdfs and compute .
5. Let X1, X2 be two random variables with the joint pmf for
zero elsewhere. Compute E(X1), E(X12), E(X2), E(X22), and
E(X1X2). Is E(X1X2) = E(X1) E(X2)? Find E(2X1 - 6X12 + 7X1X2).
6. Let X1, X2 be two random variables with joint pdf
, zero elsewhere. Compute E(X1), E(X12), E(X2), E(X22), and
E(X1X2). Is E(X1X2) = E(X1) E(X2)? Find E(3X2 - 2X12 + 6X1X2).


7. Let X1, X2 be two random variables with joint pdf for


, zero elsewhere. Determine the joint mgf of X1, X2. Does
?

Meeting 3
CHAPTER 3
TRANSFORMATIONS OF BIVARIATE RANDOM VARIABLES

Learning Objectives
After studying this material, students are expected to be able to:
1. Carry out one-to-one transformations of discrete and continuous random variables.
2. Find the joint probability mass function of two new discrete random variables.
3. Find the joint probability density function of two new continuous random variables.

We now discuss transformations of bivariate random variables. Let (X1, X2) be a random vector. Suppose we know the joint distribution of (X1, X2) and we seek the distribution of a transformation of (X1, X2), say Y = g(X1, X2). We want to obtain the pmf or pdf of Y by transformation. We discuss the discrete and continuous cases separately.

Discrete case
Let $p_{X_1,X_2}(x_1,x_2)$ be the joint pmf of two discrete-type random variables X1 and X2, with S the (two-dimensional) set of points at which $p_{X_1,X_2}(x_1,x_2) > 0$; i.e., S is the support of (X1, X2). Let $y_1 = u_1(x_1,x_2)$ and $y_2 = u_2(x_1,x_2)$ define a one-to-one transformation that maps S onto T. The joint pmf of the two new random variables Y1 = u1(X1, X2) and Y2 = u2(X1, X2) is given by

$p_{Y_1,Y_2}(y_1,y_2) = p_{X_1,X_2}[w_1(y_1,y_2), w_2(y_1,y_2)]$, for $(y_1,y_2) \in T$,

where $x_1 = w_1(y_1,y_2)$, $x_2 = w_2(y_1,y_2)$ is the single-valued inverse of $y_1 = u_1(x_1,x_2)$, $y_2 = u_2(x_1,x_2)$. From this joint pmf we may obtain the marginal pmf of Y1
by summing on y2, or the marginal pmf of Y2 by summing on y1.
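The discrete change-of-variable technique can be mirrored directly in code. The following Python sketch uses an assumed illustrative pmf (not one of the module's examples) and the one-to-one map y1 = x1 + x2, y2 = x2, whose inverse is x1 = y1 - y2, x2 = y2.

```python
from collections import defaultdict

# Illustrative joint pmf of (X1, X2).
joint_pmf_x = {(0, 0): 1/8, (0, 1): 1/8, (1, 0): 2/8, (1, 1): 4/8}

# p_{Y1,Y2}(y1, y2) = p_{X1,X2}(y1 - y2, y2) on the image of the support.
joint_pmf_y = {(x1 + x2, x2): p for (x1, x2), p in joint_pmf_x.items()}

# Marginal pmf of Y1 = X1 + X2, obtained by summing on y2.
p_y1 = defaultdict(float)
for (y1, y2), p in joint_pmf_y.items():
    p_y1[y1] += p

print(dict(p_y1))   # {0: 0.125, 1: 0.375, 2: 0.5}
```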


In using this change of variable technique (transformation), it should be emphasized


that we need two “new” variables (Y1,Y2) to replace the two “old” variables (X1,X2).

Example 8.
Let X1 and X2 have the joint pmf

and is zero elsewhere, where µ1 and µ2 are fixed positive real numbers. Thus the space
. We wish to find the pmf of .
If we use the change of variable technique, we need to define a second random variable Y 2.
Because Y2 is of no interest to us, let us choose it in such a way that we have a simple one-
to-one transformation. For example, take . Then and
represent a one-to-one transformation that maps S onto
.
Note that if , then . The inverse functions are given by
and . Thus the joint pmf of Y1 and Y2 is

, zero elsewhere.

Consequently, the marginal pmf of Y1 is given by

and is zero elsewhere, where the third equality follows from the binomial expansion.

Continuous case
Let (X1, X2) have a jointly continuous distribution with pdf $f_{X_1,X_2}(x_1,x_2)$ and support set S. Suppose the random variables Y1 and Y2 are given by Y1 = u1(X1, X2) and Y2 = u2(X1, X2), where the functions $y_1 = u_1(x_1,x_2)$ and $y_2 = u_2(x_1,x_2)$ define a one-to-one transformation that maps the set S in $R^2$ onto a (two-dimensional) set T in $R^2$, where T is the support of (Y1, Y2). If we express each of x1 and x2 in terms of y1 and y2, we can write


$x_1 = w_1(y_1,y_2)$, $x_2 = w_2(y_1,y_2)$. The Jacobian of the transformation is the determinant of order 2 given by

$J = \begin{vmatrix} \partial x_1/\partial y_1 & \partial x_1/\partial y_2 \\ \partial x_2/\partial y_1 & \partial x_2/\partial y_2 \end{vmatrix}$.

It is assumed that these first-order partial derivatives are continuous and that the Jacobian J is not identically equal to zero in T.

Let A be a subset of S, and let B denote the image of A under the one-to-one transformation; of course, B is a subset of T. Because the transformation is one-to-one, the events $\{(X_1,X_2) \in A\}$ and $\{(Y_1,Y_2) \in B\}$ are equivalent. Hence

$P[(Y_1,Y_2) \in B] = P[(X_1,X_2) \in A] = \int\!\!\int_A f_{X_1,X_2}(x_1,x_2)\, dx_1\, dx_2$.

Changing the variables of integration from $x_1, x_2$ to $y_1 = u_1(x_1,x_2)$, $y_2 = u_2(x_1,x_2)$, with inverse $x_1 = w_1(y_1,y_2)$, $x_2 = w_2(y_1,y_2)$, gives

$\int\!\!\int_A f_{X_1,X_2}(x_1,x_2)\, dx_1\, dx_2 = \int\!\!\int_B f_{X_1,X_2}[w_1(y_1,y_2), w_2(y_1,y_2)]\, |J|\, dy_1\, dy_2$.

Figure 1. Sketch of the supports of (X1, X2), (S), and (Y1, Y2), (T).

Thus, for every set B in T,

$P[(Y_1,Y_2) \in B] = \int\!\!\int_B f_{X_1,X_2}[w_1(y_1,y_2), w_2(y_1,y_2)]\, |J|\, dy_1\, dy_2$,

which implies that the joint pdf of Y1 and Y2 is

$f_{Y_1,Y_2}(y_1,y_2) = f_{X_1,X_2}[w_1(y_1,y_2), w_2(y_1,y_2)]\, |J|$, for $(y_1,y_2) \in T$, and zero elsewhere.


Accordingly, the marginal pdf of Y1 can be obtained from the joint pdf $f_{Y_1,Y_2}(y_1,y_2)$ in the usual manner by integrating on y2.
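As a symbolic illustration (not the module's Example 9), the SymPy sketch below carries out the continuous change of variables Y1 = X1/X2, Y2 = X2 for an assumed joint pdf f(x1, x2) = exp(-x1 - x2), x1, x2 > 0, computing the Jacobian, the joint pdf of (Y1, Y2), and the marginal pdf of Y1.

```python
import sympy as sp

x1, x2, y1, y2 = sp.symbols("x1 x2 y1 y2", positive=True)

f = sp.exp(-x1 - x2)                        # assumed joint pdf on x1 > 0, x2 > 0

# Inverse transformation: x1 = y1*y2, x2 = y2, and its Jacobian determinant.
inv = sp.Matrix([y1 * y2, y2])
J = inv.jacobian([y1, y2]).det()            # equals y2

# Joint pdf of (Y1, Y2): f(w1(y1, y2), w2(y1, y2)) * |J|.
g = f.subs({x1: y1 * y2, x2: y2}) * sp.Abs(J)

# Marginal pdf of Y1, found by integrating out y2: 1/(1 + y1)**2.
g1 = sp.simplify(sp.integrate(g, (y2, 0, sp.oo)))
print(g1)
```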

Example 9.
Let X1 and X2 have the joint pdf

Suppose Y1 = X1 / X2 and Y2 = X2. Hence, the inverse transformation is and


, which has the Jacobian

The inequalities defining the support S of (X1, X2) become

.
These inequalities are equivalent to
and ,
which define the support set T of (Y1, Y2). Hence, the joint pdf of (Y1, Y2) is

The marginal pdfs are

zero elsewhere,

and

zero elsewhere.

EXERCISES II

1. If zero elsewhere,

is the joint pmf of X1 and X2. Find the joint pmf of and .


2. Let X1 and X2 have the joint pmf and


zero elsewhere. Find first the joint pmf of and , then find the
marginal pmf of Y1.
3. Let X1 and X2 have the joint pdf , zero
elsewhere. Find the joint pdf of Y1 = 2X1 and Y2 = X2 – X1.
4. Let X1 and X2 have the joint pdf , zero elsewhere.
Find the joint pdf of Y1 = X1/X2 and Y2 = X2.
Hint: Use the inequalities in considering the mapping from S onto T.

Meeting 4
CHAPTER 4
CORRELATION COEFFICIENT OF TWO RANDOM VARIABLES

Learning Objectives
After studying this section, students are expected to be able to:
1. Find the correlation coefficient of discrete random variables.
2. Find the correlation coefficient of continuous random variables.
In this section we use the random variables X and Y for both the continuous and the discrete case. Let the means of X and Y be µ1 and µ2, respectively, and let the variances of X and Y be $\sigma_1^2$ and $\sigma_2^2$, respectively. The covariance of X and Y is the number

$\mathrm{cov}(X,Y) = E[(X - \mu_1)(Y - \mu_2)] = E(XY) - \mu_1\mu_2$.

If each of σ1 and σ2 is positive, the number

$\rho = \dfrac{E[(X - \mu_1)(Y - \mu_2)]}{\sigma_1\sigma_2}$

is called the correlation coefficient of X and Y. Therefore,

$\mathrm{cov}(X,Y) = \rho\,\sigma_1\sigma_2$.
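As a computational illustration (not part of the module), the Python sketch below computes the covariance and correlation coefficient of a discrete pair directly from its joint pmf; the probabilities are assumed illustrative values.

```python
import math

# Illustrative joint pmf of (X, Y); key = (x, y), value = p(x, y).
joint_pmf = {(1, 1): 2/8, (1, 2): 1/8, (2, 1): 1/8, (2, 2): 4/8}

def E(g):
    return sum(g(x, y) * p for (x, y), p in joint_pmf.items())

mu1, mu2 = E(lambda x, y: x), E(lambda x, y: y)
var1 = E(lambda x, y: (x - mu1) ** 2)
var2 = E(lambda x, y: (y - mu2) ** 2)
cov = E(lambda x, y: (x - mu1) * (y - mu2))
rho = cov / math.sqrt(var1 * var2)

print(cov, rho)   # cov = 0.109375, rho = 7/15 ~ 0.4667
```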
Example 10.
Let the random variables X and Y have the joint pdf

We next compute the correlation coefficient ρ of X and Y. Now


, and

Similarly,

and

The covariance of X and Y is

Accordingly, the correlation coefficient of X and Y is

EXERCISES III
1. Let the random variables X and Y have joint pmf

a. , (x,y) = (0,0), (1,1), (2,2), zero elsewhere.

b. , (x,y) = (0,2), (1,1), (2,0), zero elsewhere.

c. , (x,y) = (0,0), (1,1), (2,0), zero elsewhere.

In each case compute the correlation coefficient of X and Y.


2. Let X and Y have the joint pmf described as follows:
(x,y) (1,1) (1,2) (1,3) (2,1) (2,2) (2,3)

p(x,y)

and p(x,y) is equal to zero elsewhere. Find the means µ1 and µ2, the variances $\sigma_1^2$ and $\sigma_2^2$, and the correlation coefficient ρ.

3. Let X and Y have joint pmf , (x,y) = (0,0), (1,0), (0,1), (1,1), (2,1),

(1,2), (2,2), zero elsewhere. Find the correlation coefficient .


4. Let X1 and X2 have the joint pmf described by the following table:
(x1,x2) (0,0) (0,1) (0,2) (1,1) (1,2) (2,2)

p(x1,x2)

and p(x1,x2) is equal to zero elsewhere.


Find , µ1, µ2, , , and .

Meeting 5
CHAPTER 5
INDEPENDENT RANDOM VARIABLES

Learning Objectives
After studying this chapter, students are expected to be able to:
1. Define the notion of mutually independent random variables.
2. Test whether random variables, discrete or continuous, are independent.
3. Use the moment generating function to test whether random variables, discrete or continuous, are independent.

We now discuss the definition of independence of random variables.


Definition 3 (Independence).
Let the random variables X1 and X2 have the joint pdf $f_{X_1,X_2}(x_1,x_2)$ [joint pmf $p_{X_1,X_2}(x_1,x_2)$] and the marginal pdfs [pmfs] $f_{X_1}(x_1)$ and $f_{X_2}(x_2)$ [$p_{X_1}(x_1)$ and $p_{X_2}(x_2)$], respectively. The random variables X1 and X2 are said to be independent if, and only if, $f_{X_1,X_2}(x_1,x_2) \equiv f_{X_1}(x_1)\, f_{X_2}(x_2)$ [$p_{X_1,X_2}(x_1,x_2) \equiv p_{X_1}(x_1)\, p_{X_2}(x_2)$]. Random variables that are not independent are said to be dependent.
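A direct way to apply Definition 3 is to compute the marginals and check whether their product reproduces the joint pdf. The SymPy sketch below does this for an assumed pdf f(x1, x2) = 4*x1*x2 on the unit square (not the pdf of Example 11), for which the factorization holds.

```python
import sympy as sp

x1, x2 = sp.symbols("x1 x2", positive=True)

f = 4 * x1 * x2                          # assumed joint pdf on 0 < x1 < 1, 0 < x2 < 1
f1 = sp.integrate(f, (x2, 0, 1))         # marginal pdf of X1: 2*x1
f2 = sp.integrate(f, (x1, 0, 1))         # marginal pdf of X2: 2*x2

independent = sp.simplify(f - f1 * f2) == 0
print(independent)                       # True: the joint pdf factors, so X1, X2 are independent
```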
Example 11.
Let the joint pdf of X1 and X2 be

We show that X1 and X2 are dependent. Here the marginal probability density functions are


and

Since $f_{X_1,X_2}(x_1,x_2) \not\equiv f_{X_1}(x_1)\, f_{X_2}(x_2)$, the random variables X1 and X2 are dependent.

Theorem 2.
Let (X1, X2) have the joint cdf $F_{X_1,X_2}(x_1,x_2)$ and let X1 and X2 have the marginal cdfs $F_{X_1}(x_1)$ and $F_{X_2}(x_2)$, respectively. Then X1 and X2 are independent if and only if $F_{X_1,X_2}(x_1,x_2) = F_{X_1}(x_1)\, F_{X_2}(x_2)$ for all $(x_1,x_2) \in R^2$.

Theorem 3.
The random variables X1 and X2 are independent if and only if the following condition holds:

$P(a < X_1 \le b,\, c < X_2 \le d) = P(a < X_1 \le b)\, P(c < X_2 \le d)$

for every $a < b$ and $c < d$, where a, b, c, and d are constants.

Example 12.
We give an example illustrating the negation of the above theorem: if $P(a < X_1 \le b,\, c < X_2 \le d) \ne P(a < X_1 \le b)\, P(c < X_2 \le d)$ for some choice of a, b, c, and d, then X1 and X2 are dependent.
Consider the dependent variables X1 and X2 of Example 11. We have

whereas

and

Hence the joint probability does not equal the product of the two marginal probabilities, so the random variables X1 and X2 are dependent.

Theorem 4.
Suppose X1 and X2 are independent and that E[u(X1)] and E[v(X2)] exist. Then

$E[u(X_1)\, v(X_2)] = E[u(X_1)]\, E[v(X_2)]$.


Note 1.

Example 13.
Let X and Y be two independent random variables with means µ1 and µ2 and positive variances $\sigma_1^2$ and $\sigma_2^2$, respectively. The independence of X and Y implies that the correlation coefficient of X and Y is zero. This is true because the covariance $E(XY) - \mu_1\mu_2 = E(X)E(Y) - \mu_1\mu_2$ equals zero, so $\rho = \mathrm{cov}(X,Y)/(\sigma_1\sigma_2) = 0$.

Theorem 5.
Suppose the joint mgf, $M(t_1,t_2)$, exists for the random variables X1 and X2. Then X1 and X2 are independent if and only if

$M(t_1,t_2) = M(t_1,0)\, M(0,t_2)$;

that is, the joint mgf is identically equal to the product of the marginal mgfs.

Note 2.

Example 14.
Let (X,Y) be a pair of random variables with the joint pdf


The mgf of (X,Y) is

provided that and , then and . So

. Therefore, the random variables are dependent.

EXERCISES IV
1. Determine whether the random variables X1 and X2 with joint pdf
, zero elsewhere, are independent or dependent.
2. If the random variables X1 and X2 have the joint pdf

zero elsewhere. Verify that X1 and X2 are dependent or


independent.
3. Let and zero elsewhere, be the joint pdf
of X1 and X2. Verify that X1 and X2 are independent or dependent and find
.
4. If the random variables X1 and X2 have the joint pdf
zero elsewhere. Are X1 and X2 independent? Also find

5. If zero elsewhere, is the joint pdf of the


random variables X1 and X2:
- Find M(t1, t2), M(t1, 0), and M(0, t2), if .
- Are X1 and X2 independent?
- If , show that

- Accordingly, find the mean and the variance of .

Meeting 6


CHAPTER 6
LINEAR COMBINATIONS OF RANDOM VARIABLES

Learning Objectives
After studying this discussion, students are expected to be able to:
1. Define a linear combination of random variables.
2. Find the covariance of two linear combinations of random variables.
3. Find the variance of a linear combination of random variables.

Let $(X_1, \ldots, X_n)'$ denote a random vector from some experiment. Often we are interested in a function of $X_1, \ldots, X_n$. In this section, we consider linear combinations of these variables, i.e., functions of the form

$T = \sum_{i=1}^{n} a_i X_i$

for a specified vector of constants $a = (a_1, \ldots, a_n)'$.

Theorem 6.

Let $T = \sum_{i=1}^{n} a_i X_i$. Provided $E|X_i| < \infty$ for $i = 1, \ldots, n$, then $E(T) = \sum_{i=1}^{n} a_i E(X_i)$.

Theorem 7.

Let $T = \sum_{i=1}^{n} a_i X_i$ and $W = \sum_{j=1}^{m} b_j Y_j$. If $E(X_i^2) < \infty$ and $E(Y_j^2) < \infty$ for $i = 1, \ldots, n$ and $j = 1, \ldots, m$, then

$\mathrm{cov}(T, W) = \sum_{i=1}^{n}\sum_{j=1}^{m} a_i b_j\, \mathrm{cov}(X_i, Y_j)$.

Proof:


Corollary 1.

Let $T = \sum_{i=1}^{n} a_i X_i$. Provided $E(X_i^2) < \infty$ for $i = 1, \ldots, n$,

$\mathrm{var}(T) = \mathrm{cov}(T, T) = \sum_{i=1}^{n} a_i^2\, \mathrm{var}(X_i) + 2\sum_{i<j} a_i a_j\, \mathrm{cov}(X_i, X_j)$.

Note 3.
If $X_1, \ldots, X_n$ are independent random variables, then $\mathrm{cov}(X_i, X_j) = 0$ for all $i \ne j$; in that case Xi and Xj are said to be uncorrelated for all $i \ne j$.

Corollary 2.
If $X_1, \ldots, X_n$ are independent random variables with finite variances, then

$\mathrm{var}(T) = \sum_{i=1}^{n} a_i^2\, \mathrm{var}(X_i)$.

If the random variables $X_1, \ldots, X_n$ are independent and identically distributed (iid), then these random variables constitute a random sample of size n from that common distribution.

Example 15 (Sample Mean).


Let $X_1, \ldots, X_n$ be independent and identically distributed random variables with common mean µ and variance $\sigma^2$. The sample mean is defined by $\bar{X} = n^{-1}\sum_{i=1}^{n} X_i$. Then

$E(\bar{X}) = \mu$ and $\mathrm{var}(\bar{X}) = \sigma^2/n$.


Thus $\bar{X}$ is an unbiased estimator of µ.
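A simple Monte Carlo check (not part of the module) of Example 15: simulate many samples of size n, compute the sample mean of each, and compare the empirical mean and variance of those sample means with µ and σ²/n. The normal distribution and parameter values below are assumed choices.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 5.0, 2.0, 25, 200_000

# Each row is one simulated sample of size n; take the sample mean of each row.
xbars = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)

print(xbars.mean())   # close to mu = 5.0
print(xbars.var())    # close to sigma**2 / n = 0.16
```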

Example 16 (Sample Variance).


Define the sample variance by

$S^2 = (n-1)^{-1}\sum_{i=1}^{n}(X_i - \bar{X})^2 = (n-1)^{-1}\left(\sum_{i=1}^{n} X_i^2 - n\bar{X}^2\right)$. (*)

Using the facts that $E(X_i^2) = \sigma^2 + \mu^2$ and $E(\bar{X}^2) = \sigma^2/n + \mu^2$, we obtain

$E(S^2) = (n-1)^{-1}\left[n(\sigma^2 + \mu^2) - n(\sigma^2/n + \mu^2)\right] = \sigma^2$.

Hence, $S^2$ is an unbiased estimator of $\sigma^2$.
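Similarly, a Monte Carlo check (not part of the module) of Example 16 shows that, with the divisor n - 1 as in expression (*), the average of simulated values of S² is close to σ². The distribution and parameters below are assumed choices.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 0.0, 3.0, 10, 200_000

samples = rng.normal(mu, sigma, size=(reps, n))
s2 = samples.var(axis=1, ddof=1)   # ddof=1 gives the divisor n - 1, as in (*)

print(s2.mean())                   # close to sigma**2 = 9.0
```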

EXERCISES V
1. Derive the second equality in expression (*).
2. Let be four iid random variables having the same pdf
zero elsewhere. Find the mean and variance of the sum Y of
these four random variables.
3. Let and be two independent random variables so that the variances of
and are and , respectively. Given that the variance of
is 25, find k.
4. If the independent variables and have means and variances
respectively, show that the mean and variance of the product

are and , respectively.

5. Find the mean and variance of the sum , where are iid,

having pdf zero elsewhere.

6. Determine the mean and variance of the sample mean where

is a random sample from a distribution having pdf


zero elsewhere.

7. Let X and Y be random variables with .


Find the mean and variance of the random variable .

8. Let X and Y be independent random variables with means and variances


Determine the correlation coefficient of X and in terms of

,
9. Let µ and σ² denote the mean and variance of the random variable X. Let Y = bX + c, where b and c are real constants. Show that the mean and variance of Y are, respectively, bµ + c and b²σ².
10. Determine the correlation coefficient of the random variables X and Y if var(X) = 4,
var(Y) = 2, and var(X+2Y) = 15.
11. Let X and Y be random variables with means and variances and
correlation coefficient . Show that the correlation coefficient of
and is .

Meeting 7
PREPARATION FOR THE MIDTERM EXAMINATION

The activities to be carried out are:

1. Working through the exercises and solutions that are still problematic.
2. Providing information about the administration of the midterm examination.

REFERENCES
1. Bain, L.J. and Engelhardt, M., 1991, Introduction to Probability and Mathematical
Statistics, Second Edition, Duxbury Press, Belmont California.
2. Hogg, R.V., McKean, J.W., and Craig, A.T., 2013, Introduction to Mathematical Statistics,
Seventh Edition, Pearson Education, Inc., Tokyo.
