Changing Random Variables Tijms Understanding Probability

The document discusses joint probability distributions for discrete and continuous random variables. For discrete random variables X and Y, their joint probability mass function p(x,y) gives the probability that X takes on value x and Y takes on value y. For continuous random variables, their joint probability density function f(x,y) describes the relative likelihood of the pair of values (x,y). The marginal densities are obtained by summing or integrating the joint density over one of the variables. Transformations of random variables preserve probabilities and allow changing from one joint distribution to another.

pp. 323-325: Transformation of random variables
p. 325: Linear transform V = X + Y, W = X - Y

Understanding Probability
Chance Rules in Everyday Life

Second Edition

HENK TIJMS
Vrije Universiteit

CAMBRIDGE UNIVERSITY PRESS
Cambridge, New York, Melbourne, Madrid, Cape Town, Singapore, São Paulo

Cambridge University Press
The Edinburgh Building, Cambridge CB2 8RU, UK

Published in the United States of America by Cambridge University Press, New York

www.cambridge.org
Information on this title: www.cambridge.org/9780521701723

© H. Tijms 2007

This publication is in copyright. Subject to statutory exception
and to the provisions of relevant collective licensing agreements,
no reproduction of any part may take place without
the written permission of Cambridge University Press.

First published 2007

Printed in the United Kingdom at the University Press, Cambridge

A catalogue record for this publication is available from the British Library

ISBN 978-0-521-70172-3 paperback

Cambridge University Press has no responsibility for the persistence or accuracy of URLs
for external or third-party internet websites referred to in this publication, and does not
guarantee that any content on such websites is, or will remain, accurate or appropriate.
Contents

Preface page ix

Introduction

PART ONE: PROBABILITY IN ACTION 9


1 Probability questions 11

2 The law of large numbers and simulation 17
2.1 The law of large numbers for probabilities 18
2.2 Basic probability concepts 27
2.3 Expected value and the law of large numbers 32
2.4 The drunkard's walk 37
2.5 The St. Petersburg paradox 39
2.6 Roulette and the law of large numbers 41
2.7 The Kelly betting system 44
2.8 Random-number generator 50
2.9 Simulating from probability distributions 55
2.10 Problems 64

3 Probabilities in everyday life 73
3.1 The birthday problem 74
3.2 The coupon collector's problem 79
3.3 Craps 82
3.4 Gambling systems for roulette 86
3.5 The 1970 draft lottery 89
3.6 Bootstrap method 93
3.7 Problems 95


4 Rare events and lotteries 103
4.1 The binomial distribution 104
4.2 The Poisson distribution 108
4.3 The hypergeometric distribution 125
4.4 Problems 134

5 Probability and statistics 141
5.1 The normal curve 143
5.2 The concept of standard deviation 151
5.3 The square-root law 159
5.4 The central limit theorem 160
5.5 Graphical illustration of the central limit theorem 164
5.6 Statistical applications 166
5.7 Confidence intervals for simulations 170
5.8 The central limit theorem and random walks 177
5.9 Falsified data and Benford's law 191
5.10 The normal distribution strikes again 196
5.11 Statistics and probability theory 197
5.12 Problems 200

6 Chance trees and Bayes' rule 206
6.1 The Monty Hall dilemma 207
6.2 The test paradox 212
6.3 Problems 217

PART TWO: ESSENTIALS OF PROBABILITY 221

7 Foundations of probability theory 223
7.1 Probabilistic foundations 223
7.2 Compound chance experiments 231
7.3 Some basic rules 235

8 Conditional probability and Bayes 243
8.1 Conditional probability 243
8.2 Bayes' rule in odds form 251
8.3 Bayesian statistics 256

9 Basic rules for discrete random variables 263
9.1 Random variables 263

9.2 Expected value 264


9.3 Expected value of sums of random variables 268
9.4 Substitution rule and variance 270
9.5 Independence of random variables 275
9.6 Special discrete distributions 279

10 Continuous random variables 284


10.1 Concept of probability density 285
10.2 Important probability densities 296
10.3 Transformation of random variables 308
10.4 Failure rate function 310

11 Jointly distributed random variables 313


11.1 Joint probability densities 313
11.2 Marginal probability densities 319
11.3 Transformation of random variables 323
11.4 Covariance and correlation coefficient 327

12 Multivariate normal distribution 331


12.1 Bivariate normal distribution 331
12.2 Multivariate normal distribution 339
12.3 Multidimensional central limit theorem 342
12.4 The chi-square test 348

13 Conditional distributions 352


13.1 Conditional probability densities 352
13.2 Law of conditional probabilities 356
13.3 Law of conditional expectations 361

14 Generating functions 367


14.1 Generating functions 367
14.2 Moment-generating functions 374

15 Markov chains 385


15.1 Markov model 386
15.2 Transient analysis of Markov chains 394
15.3 Absorbing Markov chains 398
15.4 Long-run analysis of Markov chains 404
Appendix  Counting methods and e^x 415
Recommended reading 421
Answers to odd-numbered problems 422
Bibliography 437
Index 439
11
Jointly distributed random variables

In experiments, one is often interested not only in individual random variables,


but also in relationships between two or more random variables. For example,
if the experiment is the testing of a new medicine, the researcher might be
interested in cholesterol level, blood pressure, and the glucose level of a test
person. Similarly, a political scientist investigating the behavior of voters might
be interested in the income and level of education of a voter. There are many
more examples in the physical sciences, medical sciences, and social sciences. In
applications, one often wishes to make inferences about one random variable on
the basis of observations of other random variables. The purpose of this chapter
is to familiarize the student with the notations and the techniques relating to
experiments whose outcomes are described by two or more real numbers. The
discussion is restricted to the case of pairs of random variables. Extending the
notations and techniques to collections of more than two random variables is
straightforward.

11.1 Joint probability densities


It is helpful to discuss the joint probability mass function of two discrete random
variables before discussing the concept of the joint density of two continuous
random variables. In fact, Section 9.3 has dealt with the joint distribution of
discrete random variables. If X and Y are two discrete random variables defined
on the same sample space with probability measure P, the mass function p(x, y)
defined by

p(x, y) = P(X = x, Y = y)

is called the joint probability mass function of X and Y. As noted before, P(X =
x, Y = y) is the probability assigned by P to the intersection of the two sets


Table 11.1. The joint probability mass function p(x, y).

x\y      2     3     4     5     6     7     8     9    10    11    12   |  p_X(x)
1       1/36  2/36  2/36  2/36  2/36  2/36   0     0     0     0     0   |  11/36
2        0     0    1/36  2/36  2/36  2/36  2/36   0     0     0     0   |   9/36
3        0     0     0     0    1/36  2/36  2/36  2/36   0     0     0   |   7/36
4        0     0     0     0     0     0    1/36  2/36  2/36   0     0   |   5/36
5        0     0     0     0     0     0     0     0    1/36  2/36   0   |   3/36
6        0     0     0     0     0     0     0     0     0     0    1/36 |   1/36
p_Y(y)  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36 | sum = 1

A = {ω : X(ω) = x} and B = {ω : Y(ω) = y}, with ω representing an element
of the sample space. The joint probability mass function uniquely determines
the probability distributions p_X(x) = P(X = x) and p_Y(y) = P(Y = y) by

p_X(x) = Σ_y P(X = x, Y = y),    p_Y(y) = Σ_x P(X = x, Y = y).

These distributions are called the marginal distributions of X and Y.
Example 11.1 Two fair dice are rolled. Let the random variable X represent
the smallest of the outcomes of the two rolls, and let Y represent the sum of
the outcomes of the two rolls. What is the joint probability mass function of X
and Y?

Solution. The random variables X and Y are defined on the same sample
space. The sample space is the set of all 36 pairs (i, j) for i, j = 1, ..., 6,
where i and j are the outcomes of the first and second dice. A probability
of 1/36 is assigned to each element of the sample space. In Table 11.1,
we give the joint probability mass function p(x, y) = P(X = x, Y = y).
For example, P(X = 2, Y = 5) is the probability of the intersection of
the sets A = {(2, 2), (2, 3), (3, 2), (2, 4), (4, 2), (2, 5), (5, 2), (2, 6), (6, 2)} and
B = {(1, 4), (4, 1), (2, 3), (3, 2)}. The set {(2, 3), (3, 2)} is the intersection of
these two sets and has probability 2/36 = 1/18.
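The entries of Table 11.1 are easy to reproduce by brute-force enumeration. The following short Python sketch (an illustration added here, not part of the original text) tabulates the 36 equally likely outcomes:

from fractions import Fraction
from collections import defaultdict

# Tabulate p(x, y) = P(X = x, Y = y) for X = smallest outcome, Y = sum.
pmf = defaultdict(Fraction)
for i in range(1, 7):
    for j in range(1, 7):
        pmf[(min(i, j), i + j)] += Fraction(1, 36)

print(pmf[(2, 5)])                                    # 1/18, i.e. 2/36
p_X = {x: sum(p for (a, _), p in pmf.items() if a == x) for x in range(1, 7)}
print(p_X[1], p_X[2])                                 # 11/36 and 1/4 (= 9/36)

The printed values agree with the table: P(X = 2, Y = 5) = 1/18 and p_X(1) = 11/36.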
Problem 11.1 You roll a pair of dice. What is the joint probability mass function
of the low and high points rolled?

Problem 11.2 Let X denote the number of hearts and Y the number of diamonds
in a bridge hand. What is the joint probability mass function of X and Y?

The following example provides a good starting point for a discussion of


joint probability densities.

Example 11.2 A point is picked at random inside a circular disc with radius r.
Let the random variable X denote the length of the line segment between the
center of the disc and the randomly picked point, and let the random variable
Y denote the angle between this line segment and the horizontal axis (Y is
measured in radians and so 0 ≤ Y < 2π). What is the joint distribution of X
and Y?

Solution. The two continuous random variables X and Y are defined on a


common sample space. The sample space consists of all points (v, w) in the
two-dimensional plane with v² + w² ≤ r², where the point (0, 0) represents the
center of the disc. The probability P(A) assigned to each well-defined subset
A of the sample space is taken as the area of region A divided by πr². The
probability of the event of X taking on a value less than or equal to a and Y
taking on a value less than or equal to b is denoted by P(X ≤ a, Y ≤ b). This
event occurs only if the randomly picked point falls inside the disc segment
with radius a and angle b. The area of this disc segment is ½a²b. Dividing
this by πr² gives

P(X ≤ a, Y ≤ b) = a²b/(2πr²)    for 0 ≤ a ≤ r and 0 ≤ b ≤ 2π.

We are now in a position to introduce the concept of joint density. Let X


and Y be two random variables that are defined on the same sample space with
probability measure P. The joint cumulative probability distribution function of
X and Yisdefinedby P(X :S x, Y :S y)forallx, y, where P(X :S x, Y.::: y)is
a shorthand for P({w: X(w) :S x and Y(w) :Sy}) and the symbol w represents
an element of the sample space.

Definition 11.1 The continuous random variables X and Y are said to have
a joint probability density function f (x, y) if the joint cumulative probability
distribution function P(X ≤ a, Y ≤ b) allows for the representation

P(X ≤ a, Y ≤ b) = ∫_{x=−∞}^{a} ∫_{y=−∞}^{b} f(x, y) dy dx,    −∞ < a, b < ∞,

where the function f(x, y) satisfies

f(x, y) ≥ 0 for all x, y    and    ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.


Just as in the one-dimensional case, f(a, b) allows for the interpretation

f(a, b) Δa Δb ≈ P(a − ½Δa < X ≤ a + ½Δa, b − ½Δb < Y ≤ b + ½Δb)

for all positive values of Δa and Δb, provided that f(x, y) is continuous in
the point (a, b). In other words, the probability that the random point (X, Y)
falls into a small rectangle with sides of lengths Δa, Δb around the point (a, b)
is approximately given by f(a, b) Δa Δb.

To obtain the joint probability density function f(x, y) of the random variables
X and Y in Example 11.2, we take the partial derivatives of P(X ≤ x, Y ≤ y)
with respect to x and y. It then follows from

f(x, y) = ∂²/∂x∂y P(X ≤ x, Y ≤ y)

that

f(x, y) = x/(πr²)    for 0 < x < r and 0 < y < 2π,
f(x, y) = 0          otherwise.

In general, the joint probability density function is found by determining
first the cumulative joint probability distribution function and taking next the
partial derivatives. However, it is sometimes easier to find the joint probability
density function by using its probabilistic interpretation. This is illustrated with
the next example.

Example 11.3 The pointer of a spinner of radius r is spun three times. The
three spins are performed independently of each other. With each spin, the
pointer stops at an unpredictable point on the circle. The random variable L_i
corresponds to the length of the arc from the top of the circle to the point where
the pointer stops on the ith spin. The length of the arc is measured clockwise. Let
X = min(L₁, L₂, L₃) and Y = max(L₁, L₂, L₃). What is the joint probability
density function f(x, y) of the two continuous random variables X and Y?

Solution. We can derive the joint probability density function f(x, y) by using
the interpretation that the probability P(x < X ≤ x + Δx, y < Y ≤ y + Δy)
is approximately equal to f(x, y) Δx Δy for Δx and Δy small. The event
{x < X ≤ x + Δx, y < Y ≤ y + Δy} occurs only if one of the L_i takes on a
value between x and x + Δx, one of the L_i takes on a value between y and y + Δy,
and the remaining L_i takes on a value between x and y, where 0 < x < y. There are
3 × 2 × 1 = 6 ways in which L₁, L₂, L₃ can be arranged and the probability
that for fixed i the random variable L_i takes on a value between a and b equals

(b − a)/(2πr) for 0 ≤ a < b ≤ 2πr (explain!). Thus, by the independence of
L₁, L₂, and L₃ (see the general Definition 9.2),

P(x < X ≤ x + Δx, y < Y ≤ y + Δy)
    ≈ 6 × (x + Δx − x)/(2πr) × (y + Δy − y)/(2πr) × (y − x)/(2πr).

Hence, the joint probability density function of X and Y is given by

f(x, y) = 6(y − x)/(2πr)³    for 0 < x < y < 2πr,
f(x, y) = 0                  otherwise.
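As a quick sanity check of this density (a simulation sketch added here; the sample size and seed are arbitrary choices), note that it implies E(Y − X) = πr, which is easy to compare with simulated spins:

import numpy as np

rng = np.random.default_rng(1)
r = 1.0
L = 2 * np.pi * r                       # circumference of the spinner

# Three independent spins, each stopping uniformly on (0, L).
arcs = rng.uniform(0.0, L, size=(200_000, 3))
X, Y = arcs.min(axis=1), arcs.max(axis=1)

# f(x, y) = 6(y - x)/(2*pi*r)**3 on 0 < x < y < 2*pi*r implies E(Y - X) = pi*r.
print(np.mean(Y - X), np.pi * r)        # both close to 3.14 for r = 1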

In general, if the random variables X and Y have a joint probability density
function f(x, y), then

P((X, Y) ∈ C) = ∫∫_C f(x, y) dx dy

for any set C of pairs of real numbers. In calculating a double integral over a
nonnegative integrand, it does not matter whether we integrate over x first or
over y first. This is a basic fact from calculus. The double integral can be written
as a repeated one-dimensional integral. The expression for P((X, Y) ∈ C) is
very useful to determine the probability distribution function of any function
g(X, Y) of X and Y. To illustrate this, we derive the useful result that the sum
Z = X + Y has the probability density

f_Z(z) = ∫_{−∞}^{∞} f(u, z − u) du.

To prove this convolution formula, note that

P(Z ≤ z) = ∫∫_{x+y≤z} f(x, y) dx dy = ∫_{y=−∞}^{∞} ∫_{x=−∞}^{z−y} f(x, y) dx dy
         = ∫_{v=−∞}^{z} ∫_{u=−∞}^{∞} f(u, v − u) du dv,

using the change of variables u = x and v = x + y. Next, differentiation of
P(Z ≤ z) yields the convolution formula for f_Z(z). If the random variables X
and Y are nonnegative, the convolution formula reduces to

f_Z(z) = ∫_0^z f(u, z − u) du    for z > 0.
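The convolution formula is straightforward to evaluate numerically. The sketch below (an added illustration; the choice of two independent uniform(0, 1) random variables is mine) approximates f_Z and reproduces the triangular density z on (0, 1) and 2 − z on (1, 2):

import numpy as np

# f(u, z - u) = f_X(u) f_Y(z - u) = 1 when 0 < u < 1 and 0 < z - u < 1.
def f_Z(z, n=200_000):
    u = np.linspace(0.0, z, n)
    integrand = ((u > 0) & (u < 1) & ((z - u) > 0) & ((z - u) < 1)).astype(float)
    return integrand.mean() * z         # simple Riemann approximation of the integral

print(f_Z(0.5), f_Z(1.5))               # both approximately 0.5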



Uniform distribution over a region


Another useful result is the following. Suppose that a point (X, Y) is picked
at random inside a bounded region R in the two-dimensional plane. Then, the
joint probability density function f(x, y) of X and Y is given by the uniform
density
f(x, y) = 1/(area of region R)    for (x, y) ∈ R.

The proof is simple. For any subset C ⊆ R,

P((X, Y) ∈ C) = (area of C)/(area of R),

being the mathematical definition of the random selection of a point inside the
region R. Integral calculus tells us that area of C = ∫∫_C dx dy. Thus, for any
subset C ⊆ R

P((X, Y) ∈ C) = ∫∫_C 1/(area of R) dx dy,

showing that the random point (X, Y) has the above density f(x, y).
In the following problems you are asked to apply the basic expression
P((X, Y) ∈ C) = ∫∫_C f(x, y) dx dy yourselves in order to find the probability
density of a given function of X and Y.
Problem 11.3 A point (X, Y) is picked at random inside the triangle consisting
of the points (x, y) in the plane with x, y ≥ 0 and x + y ≤ 1. What is the joint
probability density of the point (X, Y)? Determine the probability density of
each of the random variables X + Y and max(X, Y).
Problem 11.4 Let X and Y be two random variables with a joint probability
density

f(x, y) = 6~'~    for x, y > c,
f(x, y) = 0       otherwise,

for an appropriate constant c. Verify that c = ~ and calculate the probability
P(X > a, Y > b) for a, b > c.
Problem 11.5 Independently of each other, two points are chosen at random
in the interval (0, 1). What is the joint probability density of the smallest and
the largest of these two random numbers? What is the probability density of
the length of the middle interval of the three intervals that result from the two
random points in (0,1)? What is the probability that the smallest of the three
resulting intervals is larger than a?
Problem 11.6 Independently of each other, two numbers X and Y are chosen
at random in the interval (0, 1). Let Z = X/Y be the ratio of these two random
numbers.
(a) Use the joint density of X and Y to verify that P(Z ≤ z) equals ½z for
0 < z < 1 and equals 1 − 1/(2z) for z ≥ 1.
(b) What is the probability that the first significant (nonzero) digit of Z equals
1? What about the digits 2, ..., 9?
(c) What is the answer to Question (b) for the random variable V = XY?
(d) What is the density function of the random variable (X/Y)U when U is a
random number from (0, 1) that is independent of X and Y?

11.2 Marginal probability densities


If the two random variables X and Y have a joint probability density function
f(x, y), then each of the random variables X and Y has a probability density
itself. Using the fact that lim_{n→∞} P(Aₙ) = P(lim_{n→∞} Aₙ) for any nondecreasing
sequence of events Aₙ, it follows that

P(X ≤ a) = lim_{b→∞} P(X ≤ a, Y ≤ b) = ∫_{−∞}^{a} [∫_{−∞}^{∞} f(x, y) dy] dx.
This representation shows that X has probability density function

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy,    −∞ < x < ∞.

In the same way, the random variable Y has probability density function

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx,    −∞ < y < ∞.

The probability density functions f_X(x) and f_Y(y) are called the marginal
probability density functions of X and Y. The following interpretation can be
given to the marginal density f_X(x) at the point x = a when a is a continuity
point of f_X(x). For Δa small, f_X(a)Δa gives approximately the probability
that (X, Y) falls in a vertical strip in the two-dimensional plane with width Δa
and around the vertical line x = a. A similar interpretation applies to f_Y(b) for
any continuity point b of f_Y(y).
Example 11.4 A point (X, Y) is chosen at random inside the unit circle. What
is the marginal density of X?
Solution. Denote by C = {(x, y) : x² + y² ≤ 1} the unit circle. The joint prob-
ability density function f(x, y) of X and Y is given by f(x, y) = 1/(area of C)

for (x, y) ∈ C. Hence

f(x, y) = 1/π    for (x, y) ∈ C,
f(x, y) = 0      otherwise.

Using the fact that f(x, y) is equal to zero for those y satisfying y² > 1 − x²,
it follows that

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_{−√(1−x²)}^{√(1−x²)} (1/π) dy,

and so

f_X(x) = (2/π)√(1 − x²)    for −1 < x < 1,
f_X(x) = 0                 otherwise.

Can you explain why the marginal density of X is not the uniform density on
(−1, 1)? Hint: interpret P(x < X ≤ x + Δx) as the area of a vertical strip in
the unit circle.
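A simulation gives a quick check of this marginal density (a sketch added here, not from the book): sample points uniformly in the unit circle by rejection and estimate f_X near x = 0.5:

import numpy as np

rng = np.random.default_rng(2)

# Points uniform in the unit circle, obtained by rejection from the square.
pts = rng.uniform(-1.0, 1.0, size=(400_000, 2))
pts = pts[(pts ** 2).sum(axis=1) <= 1.0]

x, dx = 0.5, 0.02
est = np.mean(np.abs(pts[:, 0] - x) < dx / 2) / dx    # fraction in a strip / width
print(est, 2 / np.pi * np.sqrt(1 - x ** 2))           # both about 0.55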
Problem 11.7 A point (X, Y) is chosen at random in the equilateral triangle
having (0, 0), (1, 0), and (½, ½√3) as corner points. Determine the marginal
densities of X and Y. Before determining the function f_X(x), can you explain
why f_X(x) must be largest at x = ½?

A general condition for the independence of the jointly distributed random
variables X and Y is stated in Definition 9.2. In terms of the marginal densities,
the continuous analog of Rule 9.6 for the discrete case is:

Rule 11.1 The jointly distributed random variables X and Y are independent
if and only if

f(x, y) = f_X(x) f_Y(y)    for all x, y.

Let us illustrate this with the random variables X and Y from Example 11.2.
Then, we obtain from f_X(x) = ∫_0^{2π} x/(πr²) dy that

f_X(x) = 2x/r²    for 0 < x < r,
f_X(x) = 0        otherwise.

In the same way, we obtain from f_Y(y) = ∫_0^r x/(πr²) dx that

f_Y(y) = 1/(2π)    for 0 < y < 2π,
f_Y(y) = 0         otherwise.

The calculations lead to the intuitively obvious result that the angle Y has a
uniform distribution on (0, 2π). A somewhat more surprising result is that the

distance X and the angle Y are independent random variables, though there is
dependence between the components of the randomly picked point. The inde-
pendence of X and Y follows from the observation that f(x, y) = f_X(x) f_Y(y)
for all x, y.

To conclude this subsection, we give a very important result for the expo-
nential distribution.

Example 11.5 Suppose that X and Y are independent random variables, where
X is exponentially distributed with expected value 1/α and Y is exponentially
distributed with expected value 1/β. What is the probability distribution of
min(X, Y)? What is the probability that X is less than Y?

Solution. The answer to the first question is that min(X, Y) is exponentially
distributed with expected value 1/(α + β). It holds that

P(min(X, Y) ≤ z) = 1 − e^{−(α+β)z}  for z ≥ 0    and    P(X < Y) = α/(α + β).

The proof is simple. Noting that P(min(X, Y) ≤ z) = 1 − P(X > z, Y > z),
we have

P(min(X, Y) ≤ z) = 1 − ∫_{x=z}^{∞} ∫_{y=z}^{∞} f_X(x) f_Y(y) dx dy.

Also,

P(X < Y) = ∫_{x=0}^{∞} ∫_{y=x}^{∞} f_X(x) f_Y(y) dy dx.

Using the fact that f_X(x) = αe^{−αx} and f_Y(y) = βe^{−βy}, it is next a matter of
simple algebra to derive the results. The details are left to the reader.
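The claimed results are easy to confirm by simulation. In the sketch below (an added illustration; the rates α = 0.5 and β = 2 are arbitrary choices), both estimates match the formulas:

import numpy as np

rng = np.random.default_rng(3)
alpha, beta = 0.5, 2.0                        # E(X) = 1/alpha, E(Y) = 1/beta
X = rng.exponential(scale=1 / alpha, size=500_000)
Y = rng.exponential(scale=1 / beta, size=500_000)

print(np.mean(np.minimum(X, Y)), 1 / (alpha + beta))    # both about 0.4
print(np.mean(X < Y), alpha / (alpha + beta))           # both about 0.2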

Problem 11.8 The continuous random variables X and Y are nonnegative and
independent. Verify that the density function of Z = X + Y is given by the
convolution formula

f_Z(z) = ∫_0^z f_X(z − y) f_Y(y) dy    for z ≥ 0.

Problem 11.9 The nonnegative random variables X and Y are independent and
uniformly distributed on (c, d). What is the probability density of Z = X + Y?
What is the probability density function of V = X² + Y²? Use the latter density
to calculate the expected value of the distance of a point chosen at random inside
the unit square to the center of the unit square.

11.2.1 Substitution rule


The expected value of a given function of jointly distributed random variables
X and Y can be calculated by the two-dimensional substitution rule. In the
continuous case, we have:

Rule 11.2 If the random variables X and Y have a joint probability density
function f(x, y), then

E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy

for any function g(x, y) provided that the integral is well defined.

An easy consequence of Rule 11.2 is that

E(aX + bY) = aE(X) + bE(Y)

for any constants a, b provided that E(X) and E(Y) exist. To see this, note that

∫_{−∞}^{∞} ∫_{−∞}^{∞} (ax + by) f(x, y) dx dy
    = ∫_{−∞}^{∞} ∫_{−∞}^{∞} ax f(x, y) dx dy + ∫_{−∞}^{∞} ∫_{−∞}^{∞} by f(x, y) dx dy
    = ∫_{−∞}^{∞} ax dx ∫_{−∞}^{∞} f(x, y) dy + ∫_{−∞}^{∞} by dy ∫_{−∞}^{∞} f(x, y) dx
    = a ∫_{−∞}^{∞} x f_X(x) dx + b ∫_{−∞}^{∞} y f_Y(y) dy,

which proves the desired result. It is left to the reader to verify from Rules 11.1
and 11.2 that

E(XY) = E(X)E(Y)    for independent X and Y.

An illustration of the substitution rule is provided by Problem 2.21: what
is the expected value of the distance between two points that are chosen at
random in the interval (0, 1)? To answer this question, let X and Y be two
independent random variables that are uniformly distributed on (0, 1). The joint
density function of X and Y is given by f(x, y) = 1 for all 0 < x, y < 1. The
substitution rule gives

E(|X − Y|) = ∫_0^1 ∫_0^1 |x − y| dx dy
           = ∫_0^1 dx [∫_0^x (x − y) dy + ∫_x^1 (y − x) dy]
           = ∫_0^1 [½x² + ½(1 − x)²] dx = 1/3.

Hence, the answer to the question is 1/3.
As another illustration of Rule 11.2, consider Example 11.2 again. In this
example, a point is picked at random inside a circular disk with radius r and the
point (0, 0) as center. What is the expected value of the rectangular distance from
the randomly picked point to the center of the disk? This rectangular distance is
given by |X cos(Y)| + |X sin(Y)| (the rectangular distance from point (a, b) to
(0, 0) is defined by |a| + |b|). For the function g(x, y) = |x cos(y)| + |x sin(y)|,
we find

E[g(X, Y)] = ∫_0^r ∫_0^{2π} {x|cos(y)| + x|sin(y)|} (x/(πr²)) dy dx
           = (1/(πr²)) ∫_0^{2π} |cos(y)| dy ∫_0^r x² dx + (1/(πr²)) ∫_0^{2π} |sin(y)| dy ∫_0^r x² dx
           = (r³/(3πr²)) [∫_0^{2π} |cos(y)| dy + ∫_0^{2π} |sin(y)| dy] = 8r/(3π).

The same ideas hold in the discrete case with the probability mass function
assuming the role of the density function:

E[g(X, Y)] = Σ_x Σ_y g(x, y) p(x, y)

when the random variables X and Y have the joint probability mass function
p(x, y) = P(X = x, Y = y).
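Both illustrations can be verified by Monte Carlo simulation, in the spirit of Part One of the book. The following sketch is such an added check (the radial distance is sampled with a square root, in line with the marginal density 2x/r² found in Section 11.2):

import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# E|X - Y| for two independent uniform(0,1) numbers: should be 1/3.
x, y = rng.random(n), rng.random(n)
print(np.mean(np.abs(x - y)))                           # about 0.333

# Expected rectangular distance to the center of a disc of radius r: 8r/(3*pi).
r = 2.0
v = r * np.sqrt(rng.random(n))                          # radial distance, density 2x/r**2
w = rng.uniform(0.0, 2 * np.pi, n)                      # angle, uniform on (0, 2*pi)
print(np.mean(np.abs(v * np.cos(w)) + np.abs(v * np.sin(w))), 8 * r / (3 * np.pi))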

11.3 Transformation of random variables


In statistical applications, one sometimes needs the joint density of two random
variables V and W that are defined as functions of two other random variables
X and Y having a joint density f(x, y). Suppose that the random variables V
and W are defined by V = g(X, Y) and W = h(X, Y) for given functions g
and h. What is the joint probability density function of V and W? An answer to

this question will be given under the assumption that the transformation is one-
to-one. That is, it is assumed that the equations v = g(x, y) and w = h(x, y)
can be solved uniquely to yield functions x = a(v, w) and y = b(v, w). Also
assume that the partial derivatives of the functions a(v, w) and b(v, w) with
respect to v and w are continuous in (v, w). Then the following transformation
rule holds:

Rule 11.3 The joint probability density function of V and W is given by

f(a(v, w), b(v, w)) |J(v, w)|,

where the Jacobian J(v, w) is given by the determinant

J(v, w) = | ∂a(v, w)/∂v   ∂a(v, w)/∂w |
          | ∂b(v, w)/∂v   ∂b(v, w)/∂w |
        = (∂a(v, w)/∂v)(∂b(v, w)/∂w) − (∂a(v, w)/∂w)(∂b(v, w)/∂v).
The proof of this rule is omitted. This transformation rule looks intimidating,
but is easy to use in many applications. In the next section it will be shown
how Rule 11.3 can be used to devise a method for simulating from the normal
distribution. However, we first give a simple illustration of Rule 11.3. Suppose
that X and Y are independent N(0, 1) random variables. Then, the random vari-
ables V = X + Y and W = X − Y are normally distributed and independent.
To verify this, note that the inverse functions a(v, w) and b(v, w) are given by
x = (v + w)/2 and y = (v − w)/2. Thus, the Jacobian J(v, w) is equal to

| ½    ½ |
| ½   −½ |  = −½.

Since X and Y are independent N(0, 1) random variables, it follows from
Rule 11.1 that their joint density function is given by

f_{X,Y}(x, y) = (1/√(2π)) e^{−½x²} × (1/√(2π)) e^{−½y²},    −∞ < x, y < ∞.

Applying Rule 11.3, we obtain that the joint density function of V and W is
given by

f_{V,W}(v, w) = (1/√(2π)) e^{−½((v+w)/2)²} × (1/√(2π)) e^{−½((v−w)/2)²} × ½
            = (1/(√2 √(2π))) e^{−½v²/2} × (1/(√2 √(2π))) e^{−½w²/2},    −∞ < v, w < ∞.

This implies that f_{V,W}(v, w) = f_V(v) f_W(w) for all v, w with the marginal
density functions f_V(v) = (1/(√2 √(2π))) e^{−½v²/2} and f_W(w) = (1/(√2 √(2π))) e^{−½w²/2}. Using

Rule 11.1 again, it now follows that V = X + Y and W = X − Y are N(0, 2)
distributed and independent.
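A short simulation makes the conclusion concrete. The sketch below (added here as an illustration) estimates the variances of V and W and their correlation:

import numpy as np

rng = np.random.default_rng(5)
x = rng.standard_normal(500_000)
y = rng.standard_normal(500_000)
v, w = x + y, x - y

# Both variances should be close to 2, and the correlation close to 0.
print(v.var(), w.var(), np.corrcoef(v, w)[0, 1])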

11.3.1 Simulating from a normal distribution


A natural transformation of two independent standard normal random variables
leads to a practically useful method for simulating random observations from
the standard normal distribution. Suppose that X and Y are independent random
variables each having the standard normal distribution. Using Rule 11.1, the
joint probability density function of X and Y is given by

f(x, y) = (1/(2π)) e^{−½(x²+y²)}.

The random vector (X, Y) can be considered as a point in the two-dimensional
plane. Let the random variable V be the distance from the point (0, 0) to the point
(X, Y) and let W be the angle that the line through the points (0, 0) and (X, Y)
makes with the horizontal axis. The random variables V and W are functions
of X and Y (the functions g(x, y) = √(x² + y²) and h(x, y) = arctan(y/x)). The
inverse functions a(v, w) and b(v, w) are very simple. By basic geometry,
x = v cos(w) and y = v sin(w). We thus obtain the Jacobian

J(v, w) = | cos(w)   −v sin(w) |
          | sin(w)    v cos(w) |  = v cos²(w) + v sin²(w) = v,

using the celebrated identity cos²(w) + sin²(w) = 1. Hence, the joint probabil-
ity density function of V and W is given by

f_{V,W}(v, w) = (v/(2π)) e^{−½v²(cos²(w)+sin²(w))} = (v/(2π)) e^{−½v²}

for 0 < v < ∞ and 0 < w < 2π. The marginal densities of V and W are given
by

f_V(v) = ∫_0^{2π} (v/(2π)) e^{−½v²} dw = v e^{−½v²},    0 < v < ∞,

and

f_W(w) = (1/(2π)) ∫_0^∞ v e^{−½v²} dv = 1/(2π),    0 < w < 2π.

Since f_{V,W}(v, w) = f_V(v) f_W(w), we have the remarkable finding that V and W
are independent random variables. The random variable V has the probability
density function v e^{−½v²} for v > 0 and W is uniformly distributed on (0, 2π).
This result is extremely useful for simulation purposes. Using the inverse-
transformation method from Section 10.3, it is a simple matter to simulate

random observations from the probability distributions of V and W. If we let
U₁ and U₂ denote two independent random numbers from the interval (0, 1), it
follows from results in Section 10.3 that random observations of V and W are
given by

V = √(−2 ln(1 − U₁))    and    W = 2πU₂.

Next, one obtains two random observations X and Y from the standard normal
distribution by taking

X = V cos(W)    and    Y = V sin(W).

Theoretically, X and Y are independent of each other. However, if a pseudo-
random generator is used to generate U₁ and U₂, one uses only one of the two
variates X and Y. It surprisingly appears that the points (X, Y) lie on a spiral in
the plane when a multiplicative generator is used for the pseudo-random numbers.
The explanation of this subtle dependency lies in the fact that pseudo-random
numbers are not truly random. The method described above for generating
normal variates is known as the Box-Muller method.
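A direct implementation of the Box-Muller method takes only a few lines. The sketch below is an added illustration (the function name and the use of NumPy's random generator are choices made here, not the book's):

import numpy as np

def box_muller(n, rng=None):
    # Generate n standard normal variates from pairs of independent
    # uniform(0,1) numbers via V = sqrt(-2 ln(1 - U1)), W = 2*pi*U2.
    rng = rng or np.random.default_rng()
    m = (n + 1) // 2
    u1, u2 = rng.random(m), rng.random(m)
    v = np.sqrt(-2.0 * np.log(1.0 - u1))    # distance, density v*exp(-v**2/2)
    w = 2.0 * np.pi * u2                    # angle, uniform on (0, 2*pi)
    return np.concatenate([v * np.cos(w), v * np.sin(w)])[:n]

z = box_muller(100_000)
print(z.mean(), z.std())                    # close to 0 and 1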

Problem 11.10 A point (V, W) is chosen inside the unit circle as follows. First,
a number R is chosen at random between 0 and 1. Next, a point is chosen at
random on the circumference of the circle with radius R. Use the transformation
formula to find the joint density function of this point (V, W). What is the
marginal density function of each of the components of the point (V, W)? Can
you intuitively explain why the point (V, W) is not uniformly distributed over
the unit circle?

Problem 11.11 Let (X, Y) be a point chosen at random inside the unit circle.
Define V and W by V = X√(−2 ln(Q)/Q) and W = Y√(−2 ln(Q)/Q), where
Q = X² + Y². Verify that the random variables V and W are independent and
N(0, 1) distributed. This method for generating normal variates is known as
Marsaglia's polar method.

Problem 11.12 The independent random variables Z and Y have a standard
normal distribution and a chi-square distribution with n degrees of freedom.
Use the transformation V = Y and W = Z/√(Y/n) to prove that the random
variable W = Z/√(Y/n) has a Student-t density with n degrees of freedom.
Hint: in evaluating f_W(w) from ∫_0^∞ f_{V,W}(v, w) dv, use the fact that the gamma
density λ^α x^{α−1} e^{−λx}/Γ(α) integrates to 1 over (0, ∞).

11.4 Covariance and correlation coefficient


Let the random variables X and Y be defined on the same sample space with
probability measure P. A basic rule in probability is that the expected value
of the sum X + Y equals the sum of the expected values of X and Y. Does a
similar rule hold for the variance of the sum X + Y? To answer this question,
we apply the definition of variance. The variance of X + Y equals
E[{X +Y- + Y)} 2 ]
E(X
= E[(X - E(X))2 + 2(X - E(X))(Y - E(Y)) + (Y - E(Y))2 )
= var(X) + 2E[(X - E(X))(Y - E(Y))] + var(Y).
This leads to the following general definition.
Definition 11.2 The covariance cov(X, Y) of two random variables X and Y
is defined by
cov(X, Y) = E[(X − E(X))(Y − E(Y))]
whenever the expectations exist.
The formula for cov(X, Y) can be written in the equivalent form
cov(X, Y) = E(XY) - E(X)E(Y)
by expanding (X - E(X))(Y - E(Y)) into XY - X E(Y) - Y E(X) +
E(X)E(Y) and noting that the expectation is a linear operator. Using the fact
that E(XY) = E(X)E(Y) for independent random variables, the alternative
formula for cov( X, Y) has as direct consequence:
Rule 11.4 If X and Y are independent random variables, then
cov(X, Y) =0.
However, the converse of this result is not always true. A simple example of
two dependent random variables X and Y having covariance zero is given in
Section 9.4. Another counterexample is provided by the random variables X =
Z and Y = Z², where Z has the standard normal distribution. Nevertheless,
cov(X, Y) is often used as a measure of the dependence of X and Y. The
covariance appears over and over in practical applications (see the discussion
in Section 5.2).
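The counterexample X = Z and Y = Z² can be checked numerically; the following added sketch shows a sample covariance near zero even though Y is completely determined by X:

import numpy as np

rng = np.random.default_rng(6)
z = rng.standard_normal(1_000_000)

# cov(Z, Z^2) = E(Z^3) - E(Z)E(Z^2) = 0, yet Z^2 is a function of Z.
print(np.cov(z, z ** 2)[0, 1])              # close to 0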
Using the definition of covariance and the above expression for var(X + Y),
we find the general rule:
Rule 11.5 For any two random variables X and Y
var(X + Y) = var(X) + 2cov(X, Y) + var(Y).

If the random variables X and Y are independent, then

var(X + Y) = var(X) + var(Y).

The units of cov(X, Y) are not the same as the units of E(X) and E(Y).
Therefore, it is often more convenient to use the correlation coefficient of X
and Y, which is defined by

ρ(X, Y) = cov(X, Y) / (√var(X) √var(Y)),

provided that var(X) > 0 and var(Y) > 0. The correlation coefficient is a dimen-
sionless quantity with the property that

−1 ≤ ρ(X, Y) ≤ 1.

The reader is asked to prove this property in Problem 11.14. The random vari-
ables X and Y are said to be uncorrelated if ρ(X, Y) = 0. Independent ran-
dom variables are always uncorrelated, but the converse is not always true. If
ρ(X, Y) = ±1, then Y is fully determined by X. In this case it can be shown
that Y = aX + b for constants a and b with a ≠ 0.

The problem section of Chapter 5 contains several exercises on the covari-
ance and correlation coefficient. Here are some more exercises.
Problem 11.13 The continuous random variables X and Y have the joint density
f(x, y) = 4x for 0 < x < y < 1 and f(x, y) = 0 otherwise. What is the cor-
relation coefficient of X and Y? Can you intuitively explain why this correlation
coefficient is positive?
Problem 11.14 Verify that

var(aX + b) = a² var(X)    and    cov(aX, bY) = ab cov(X, Y)

for any constants a, b. Next, evaluate the variance of the random variable Z =
Y/√var(Y) − ρ(X, Y) X/√var(X) to prove that −1 ≤ ρ(X, Y) ≤ 1. Also, for
any constants a, b, c, and d, verify that cov(aX + bY, cV + dW) can be worked
out as ac cov(X, V) + ad cov(X, W) + bc cov(Y, V) + bd cov(Y, W).
Problem 11.15 The amounts of rainfall in Amsterdam during each of the
months January, February, ..., December are independent random variables
with expected values of 62.1, 43.4, 58.9, 41.0, 48.3, 61.5, 65.8, 61.4, 82.3,
85.1, 89.0, and 74.9 mm and with standard deviations of 33.9, 27.8, 31.1, 24.1,
29.3, 33.8, 36.8, 32.1, 46.6, 42.4, 40.0, and 36.2 mm. What are the expected
value and the standard deviation of the annual rainfall in Amsterdam? Calculate
an approximate value for the probability that the total rainfall in Amsterdam
next year will be larger than 1,000 mm.

Problem 11.16 Let the random variables X₁, ..., Xₙ be defined on a common
probability space. Prove that

var(X₁ + ··· + Xₙ) = Σ_{i=1}^n var(Xᵢ) + 2 Σ_{i=1}^n Σ_{j=i+1}^n cov(Xᵢ, Xⱼ).

Next, evaluate var(Σ_{i=1}^n tᵢXᵢ) in order to verify that Σ_{i=1}^n Σ_{j=1}^n tᵢtⱼσᵢⱼ ≥ 0
for all real numbers t₁, ..., tₙ, where σᵢⱼ = cov(Xᵢ, Xⱼ). In other words, the
covariance matrix C = (σᵢⱼ) is positive semi-definite.

Problem 11.17 The hypergeometric distribution describes the probability mass
function of the number of red balls drawn when n balls are randomly chosen
from an urn containing R red and W white balls. Show that the variance of
the number of red balls drawn is given by
n (R/(R+W)) (1 − R/(R+W)) (R+W−n)/(R+W−1).
Hint: the number of red balls drawn can be written as X₁ + ··· + X_R, where Xᵢ
equals 1 if the ith red ball is selected and 0 otherwise.

Problem 11.18 What is the variance of the number of distinct birthdays within
a randomly formed group of 100 persons? Hint: define the random variable Xᵢ
as 1 if the ith day is among the 100 birthdays, and as 0 otherwise.
Problem 11.19 You roll a pair of dice. What is the correlation coefficient of the
high and low points rolled?
Problem 11.20 What is the correlation coefficient of the Cartesian coordinates
of a point picked at random in the unit circle?

11.4.1 Linear predictor


Suppose that X and Y are two dependent random variables. In statistical appli-
cations, it is often the case that we can observe the random variable X but we
want to know the dependent random variable Y. A basic question in statistics
is: what is the best linear predictor of Y with respect to X? That is, for which
=
linear function y a + f3x is
E[(Y - a - {3X)2 ]

minimal? The answer to this question is


Uy
Y = µy + Pxr-(x
ax
- µx).

where µx = E(X), µy = E(f), ux = Jvar(X), uy = y'var(Y), and PxY =


=
p(X, Y). The derivation is simple. Rewriting y a + f3x as y = µ r + f3(x -
µx) - (µy - a - f3µx ), it follows after some algebra that E[(Y - a - ,8X)2]

can be evaluated as

E[{Y − μ_Y − β(X − μ_X) + μ_Y − α − βμ_X}²]
    = E[{Y − μ_Y − β(X − μ_X)}²] + (μ_Y − α − βμ_X)²
      + 2(μ_Y − α − βμ_X) E[Y − μ_Y − β(X − μ_X)]
    = σ_Y² + β²σ_X² − 2βρ_XY σ_X σ_Y + (μ_Y − α − βμ_X)².

In order to minimize this quadratic function in α and β, we put the partial
derivatives of the function with respect to α and β equal to zero. This leads
after some simple algebra to

β = ρ_XY σ_Y / σ_X    and    α = μ_Y − (ρ_XY σ_Y / σ_X) μ_X.

For these values of α and β, we have the minimal value

E[(Y − α − βX)²] = σ_Y²(1 − ρ_XY²).

This minimum is sometimes called the residual variance of Y.
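In practice the best linear predictor is estimated by plugging sample moments into these formulas. The following sketch (an added illustration with simulated data; the names and the chosen coefficients are mine) recovers known values of α and β:

import numpy as np

def best_linear_predictor(x, y):
    # beta = rho * sigma_y / sigma_x,  alpha = mu_y - beta * mu_x
    beta = np.corrcoef(x, y)[0, 1] * y.std() / x.std()
    return y.mean() - beta * x.mean(), beta

rng = np.random.default_rng(7)
x = rng.standard_normal(100_000)
y = 1.0 + 2.0 * x + rng.standard_normal(100_000)    # true alpha = 1, beta = 2
print(best_linear_predictor(x, y))                  # approximately (1.0, 2.0)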
1be phenomenon of rqrruion to IM nwan can be explained with the help
of the best linear predictor. Think of X 81 the height of a 25-year-old father and
think of Y 81 the height his newborn ROD will have at the age of 2S ycarx. It
=
is reuonable to usumc that µx µy =µ..ox= oy o, and p p(X. Y)= =
=
is positive. 1be best linear predictor f' of Y then salisfics f' - µ. p(X - µ.)
with 0 < p < I. If the height of the father scores above the mean, the best
linear prediction is that the height of the ROD will score closer to the mean.
Very tall fathers tend to have somewhat shorter sons and very short fathers
somewhat taller ones! Regression to the mean shows up in a wide variety of
placc5: it helps explain why great movies have often disappointing sequels. and
disas~ presidents have often better succcsson.