Lecture 2: Probability

This document defines and illustrates key concepts for continuous random variables. The probability density function (pdf) of a continuous random variable must be nonnegative and integrate to 1 over the real line; the probability of an event is the area under the pdf curve over the relevant interval. The cumulative distribution function (cdf) gives the probability that the random variable is less than or equal to a value and is an antiderivative of the pdf. The expected value is defined by an integral of the variable weighted by the pdf, and variance and standard deviation are defined as in the discrete case. Worked examples are included throughout.


RANDOM VARIABLES

Definition and Basic Properties


The pdf of a continuous random variable X must satisfy two conditions.
It is a nonnegative function (but unlike in the discrete case it may take on values exceeding 1).
Its definite integral over the whole real line equals one. That is,

$$\int_{-\infty}^{\infty} f(x)\,dx = 1$$

Definition and Basic Properties


The pdf of a continuous random variable X must satisfy a third condition as well.
Its definite integral over a subset B of the real numbers gives the probability that X takes a value in B. That is,

$$\int_B f(x)\,dx = P(X \in B)$$

for every subset B of the real numbers. As a special case (the usual case), for all real numbers a and b,

$$\int_a^b f(x)\,dx = P(a \le X \le b)$$

Put simply, the probability is the area under the pdf curve over the interval [a, b].
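
To make the area interpretation concrete, here is a minimal numerical sketch in Python (SciPy is an assumed tool, not part of the slides); the density f(x) = 2x on [0, 1] is an illustrative choice only.

```python
# Numerical check (not from the slides) of the two defining properties of a pdf,
# using an illustrative density f(x) = 2x on [0, 1] and 0 elsewhere.
from scipy.integrate import quad

def f(x):
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

total, _ = quad(f, -5, 5, points=[0.0, 1.0])   # integral over (effectively) the whole line
prob, _ = quad(f, 0.25, 0.75)                  # P(0.25 <= X <= 0.75) as area under the curve

print(total)   # ~1.0
print(prob)    # ~0.5, i.e. 0.75**2 - 0.25**2
```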

Definition and Basic Properties


Note that by this definition the probability of X taking on a single value a is always 0. This follows from

$$P(X = a) = P(a \le X \le a) = \int_a^a f(x)\,dx = 0,$$

since every definite integral over a degenerate interval is 0. This is, of course, quite different from the situation for discrete random variables.

Definition and Basic Properties

Examples
Let X be a random variable with range [0,2] and pdf defined by f(x) = 1/2 for all x between 0 and 2 and f(x) = 0 for all other values of x. Note that since the integral of zero is zero we get

$$\int_{-\infty}^{\infty} f(x)\,dx = \int_0^2 \frac{1}{2}\,dx = \frac{1}{2}x\Big|_0^2 = 1 - 0 = 1$$

That is, as with all continuous pdfs, the total area under the curve is 1. We might use this random variable to model the position at which a two-meter length of rope breaks when put under tension, assuming every point is equally likely. Then the probability the break occurs in the last half-meter of the rope is

$$P(3/2 \le X \le 2) = \int_{3/2}^{2} f(x)\,dx = \int_{3/2}^{2} \frac{1}{2}\,dx = \frac{1}{2}x\Big|_{3/2}^{2} = \frac{1}{4}$$
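
As a quick check of this example, the same probability can be computed and simulated with SciPy and NumPy, assuming those libraries are available:

```python
# Sketch of the two-meter rope example: X ~ Uniform(0, 2), so the chance the
# break falls in the last half-meter should be 1/4.
import numpy as np
from scipy.stats import uniform

X = uniform(loc=0, scale=2)            # uniform distribution on [0, 2]
print(X.cdf(2.0) - X.cdf(1.5))         # exact answer: 0.25

# Monte Carlo check: fraction of simulated break points in [1.5, 2]
rng = np.random.default_rng(0)
breaks = rng.uniform(0, 2, size=100_000)
print((breaks >= 1.5).mean())          # approximately 0.25
```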

Definition and Basic Properties

Examples
Let Y be a random variable whose range is the nonnegative reals and whose pdf is defined by

$$f(x) = \frac{1}{750} e^{-x/750}$$

for nonnegative values of x (and 0 for negative values of x). Then

$$\int_{-\infty}^{\infty} f(x)\,dx = \int_0^{\infty} \frac{1}{750} e^{-x/750}\,dx = \lim_{t \to \infty} \int_0^{t} \frac{1}{750} e^{-x/750}\,dx = \lim_{t \to \infty} \left(-e^{-x/750}\Big|_0^{t}\right) = \lim_{t \to \infty} \left(e^{0} - e^{-t/750}\right) = 1 - 0 = 1$$
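
A numerical sketch of the same improper integral, assuming SciPy is available; quad evaluates the limit computed by hand above:

```python
# Numerical check that this pdf integrates to 1 over [0, infinity).
import math
from scipy.integrate import quad

def f(x):
    return math.exp(-x / 750.0) / 750.0

total, err = quad(f, 0, math.inf)
print(total)   # ~1.0
```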

Definition and Basic Properties
The random variable Y might be a
reasonable choice to model the lifetime in
hours of a standard light bulb with average
life 750 hours. To find the probability a bulb
lasts under 500 hours, you calculate
$$P(0 \le Y \le 500) = \int_0^{500} \frac{1}{750} e^{-x/750}\,dx = -e^{-x/750}\Big|_0^{500} = 1 - e^{-2/3} \approx 0.487$$
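
The same probability via scipy.stats (an assumed tool, not part of the slides): expon(scale=750) has exactly the density above.

```python
# Light-bulb lifetime sketch: Y exponential with mean 750 hours.
import math
from scipy.stats import expon

Y = expon(scale=750)
print(Y.cdf(500))              # P(Y <= 500), approximately 0.4866
print(1 - math.exp(-2 / 3))    # the closed form 1 - e^(-2/3), same value
```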

Cumulative Distribution Functions
The cdf of a continuous random variable has the same definition as that for a discrete random variable. That is,

$$F(x) = P(X \le x)$$

In practice this means that F is essentially a particular antiderivative of the pdf, since

$$F(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\,dt$$

Thus at the points where f is continuous, F'(x) = f(x).

Cumulative Distribution Functions
Knowing the cdf of a random variable greatly
facilitates computation of probabilities
involving that random variable since, by the
Fundamental Theorem of Calculus,

$$P(a \le X \le b) = F(b) - F(a)$$

Cumulative Distribution Functions
In the second example above, F(x) = 0 if x is negative, and for nonnegative x we have

$$F(x) = \int_0^{x} \frac{1}{750} e^{-t/750}\,dt = -e^{-t/750}\Big|_0^{x} = -\left(e^{-x/750} - 1\right) = 1 - e^{-x/750}$$

Thus the probability of a light bulb lasting between 500 and 1000 hours is

$$F(1000) - F(500) = \left(1 - e^{-1000/750}\right) - \left(1 - e^{-500/750}\right) = e^{-2/3} - e^{-4/3} \approx 0.250$$
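
A quick check of this value, assuming SciPy is available:

```python
# F(1000) - F(500) for the light-bulb model, from scipy's cdf and from the
# closed form derived above.
import math
from scipy.stats import expon

F = expon(scale=750).cdf
print(F(1000) - F(500))                     # approximately 0.2498
print(math.exp(-2 / 3) - math.exp(-4 / 3))  # e^(-2/3) - e^(-4/3), same value
```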

Cumulative Distribution Functions
In the first example above, F(x) = 0 for negative x, F(x) = 1 for x greater than 2, and F(x) = x/2 for x between 0 and 2, since for such x we have

$$F(x) = \int_0^{x} \frac{1}{2}\,dt = \frac{1}{2}t\Big|_0^{x} = \frac{x}{2}$$

Thus to find the probability the rope breaks somewhere in the first meter we calculate F(1) - F(0) = 1/2 - 0 = 1/2, which is intuitively correct.
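
And the corresponding check for the rope example, again assuming SciPy:

```python
# Rope-break cdf check: F(x) = x/2 on [0, 2], so F(1) - F(0) = 1/2.
from scipy.stats import uniform

F = uniform(loc=0, scale=2).cdf
print(F(1) - F(0))   # 0.5
```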

Cumulative Distribution Functions
If X is a continuous random variable, then its cdf is a continuous function. Moreover,

$$\lim_{x \to -\infty} F(x) = 0 \qquad \text{and} \qquad \lim_{x \to \infty} F(x) = 1$$

Again these results are intuitive.

Expectation and Variance

Definitions
The expected value of a continuous random variable X is defined by

$$E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$$

Note the similarity to the definition for discrete random variables. Once again we often denote it by μ. As in the discrete case this integral may not converge, in which case the expectation of X is undefined.
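
A numerical sketch of this definition, reusing the illustrative density f(x) = 2x on [0, 1] from earlier (not from the slides):

```python
# E(X) as the integral of x f(x), computed numerically.
from scipy.integrate import quad

def f(x):
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

mean, _ = quad(lambda x: x * f(x), 0, 1)
print(mean)   # 2/3
```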

Expectation and Variance

Definitions
As in the discrete case we define the variance by

$$\mathrm{Var}(X) = E\left((X - \mu)^2\right)$$

Once again the standard deviation is the square root of variance. Variance and standard deviation do not exist if the expected value by which they are defined does not converge.
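
The same illustrative density can be used to sketch the variance definition numerically; the exact values are μ = 2/3 and Var(X) = 1/18:

```python
# Var(X) = E((X - mu)^2) for the illustrative density f(x) = 2x on [0, 1].
from scipy.integrate import quad

def f(x):
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

mu, _ = quad(lambda x: x * f(x), 0, 1)
var, _ = quad(lambda x: (x - mu) ** 2 * f(x), 0, 1)
print(mu, var, var ** 0.5)   # ~0.667, ~0.0556, sd ~0.236
```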

Expectation and Variance

Theorems
The Law of the Unconscious Statistician holds in the continuous case. Here it states

$$E(h(X)) = \int_{-\infty}^{\infty} h(x) f(x)\,dx$$

Expected value still preserves linearity. That is,

$$E(aX + b) = aE(X) + b$$

The proof depends on the linearity of the definite integral (even an improper Riemann integral).
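
Both results can be sanity-checked by simulation; the sketch below uses the Uniform(0, 2) rope model, with h and the sample size as illustrative choices:

```python
# LOTUS and linearity of expectation, checked by simulation for X ~ Uniform(0, 2).
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(1)
x = rng.uniform(0, 2, size=200_000)

h = lambda t: t ** 2
print(np.mean(h(x)))                         # simulated E(h(X))
print(quad(lambda t: h(t) * 0.5, 0, 2)[0])   # LOTUS integral: 4/3

a, b = 3.0, -1.0
print(np.mean(a * x + b), a * np.mean(x) + b)   # both approximately 2.0
```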

Expectation and Variance

Theorems
Similarly, the expected value of a sum of functions of X equals the sum of the expected values of those functions (see Theorem 4.3 in the book), by the linearity of the definite integral.

The shortcut formula for the variance holds for continuous random variables, depending only on the two preceding linearity results and a little algebra, just as in the discrete case. The formula states

$$\mathrm{Var}(X) = E(X^2) - E(X)^2 = E(X^2) - \mu^2$$

Variance and standard deviation still act in the same way on linear functions of X. Namely,

$$\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X)$$

and

$$\mathrm{SD}(aX + b) = |a|\,\mathrm{SD}(X)$$
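
A simulation sketch of the shortcut formula and the scaling rules, again using the Uniform(0, 2) rope model:

```python
# Shortcut formula and scaling rules for X ~ Uniform(0, 2), where Var(X) = 1/3.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 2, size=200_000)

print(np.mean(x ** 2) - np.mean(x) ** 2)        # E(X^2) - E(X)^2, approximately 1/3
print(np.var(x))                                # Var(X), same value

a, b = -2.0, 5.0
print(np.var(a * x + b), a ** 2 * np.var(x))    # both approximately 4/3
print(np.std(a * x + b), abs(a) * np.std(x))    # both approximately 1.155
```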

Expectation and Variance

Examples
In the two-meter rope problem, the expected value should be 1, intuitively. Let us calculate:

$$E(X) = \int_0^2 x \cdot \frac{1}{2}\,dx = \frac{1}{4}x^2\Big|_0^2 = 1 - 0 = 1$$
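
The same integral done symbolically (sympy is an assumed tool, not part of the slides):

```python
# Symbolic check that E(X) = 1 for the rope example.
import sympy as sp

x = sp.symbols('x')
print(sp.integrate(x * sp.Rational(1, 2), (x, 0, 2)))   # 1
```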

Expectation and Variance

Examples
In the same example the variance is

$$\mathrm{Var}(X) = E(X^2) - 1^2 = \int_0^2 x^2 \cdot \frac{1}{2}\,dx - 1 = \frac{1}{6}x^3\Big|_0^2 - 1 = \frac{4}{3} - 1 = \frac{1}{3}$$

and consequently

$$\mathrm{SD}(X) = \sqrt{\frac{1}{3}} = \frac{1}{\sqrt{3}} = \frac{\sqrt{3}}{3} \approx 0.577$$
This result seems plausible.
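
A symbolic check of the variance and standard deviation just computed, again assuming sympy:

```python
# Var(X) = E(X^2) - 1 = 1/3 and SD(X) = 1/sqrt(3) for the rope example.
import sympy as sp

x = sp.symbols('x')
EX2 = sp.integrate(x ** 2 * sp.Rational(1, 2), (x, 0, 2))   # E(X^2) = 4/3
var = EX2 - 1
print(var, sp.sqrt(var))   # 1/3, sqrt(3)/3, approximately 0.577
```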

Expectation and Variance

Examples
It is also possible to compute the expected value
and variance in the light bulb example. The
integration involves integration by parts.
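
A symbolic sketch of that computation (sympy handles the integration-by-parts step automatically); the exact answers are E(Y) = 750 and Var(Y) = 750²:

```python
# Mean and variance of the light-bulb lifetime, done symbolically.
import sympy as sp

x = sp.symbols('x', nonnegative=True)
f = sp.exp(-x / 750) / 750
EY = sp.integrate(x * f, (x, 0, sp.oo))
EY2 = sp.integrate(x ** 2 * f, (x, 0, sp.oo))
print(EY, EY2 - EY ** 2)   # 750, 562500
```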
