Definition (Permutation)
An ordering of $n$ distinct objects $x_1, x_2, \ldots, x_n$ is called a permutation of $x_1, x_2, \ldots, x_n$.
Definition (Factorial)
We define $0! = 1$ and, for $n = 1, 2, 3, \ldots$, $n$ factorial as $n! = n(n-1)\cdots(2)(1)$.
Proposition
There are $n!$ permutations of $n$ distinct objects.
Proposition
Suppose that a sequence $S$ of $n$ items has $n_1$ identical items of type 1, $n_2$ identical items of type 2, …, and $n_k$ identical items of type $k$. Then, the number of orderings of $S$ is
$$\frac{n!}{n_1!\, n_2! \cdots n_k!}.$$
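For a concrete check (not from the original notes): the word "AABBC" has $n = 5$ letters with multiplicities $2, 2, 1$, so the formula predicts $5!/(2!\,2!\,1!) = 30$ distinct orderings. A minimal Python sketch verifying this by brute force:

```python
from itertools import permutations
from math import factorial

word = "AABBC"  # n = 5, with n1 = 2, n2 = 2, n3 = 1
distinct = len(set(permutations(word)))  # enumerate all orderings, deduplicate
formula = factorial(5) // (factorial(2) * factorial(2) * factorial(1))
assert distinct == formula == 30
```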
Definition ( k -combination)
Let $x_1, x_2, \ldots, x_n$ be $n$ distinct elements. A $k$-combination of $x_1, x_2, \ldots, x_n$ is a $k$-element subset of the set $\{x_1, x_2, \ldots, x_n\}$.
Further, there are $\binom{n}{k} = \frac{n!}{k!\,(n-k)!}$ $k$-combinations of $n$ distinct elements.
Definition (Binomial Coefficient)
For $k \le n$, we define the binomial coefficient as follows:
$$\binom{n}{k} = \frac{n!}{k!\,(n-k)!}$$
For $k > n$, we set $\binom{n}{k} = 0$.
Identities
(a) $\binom{n}{k} = \binom{n}{n-k}$
(b) $\binom{n}{k} = \binom{n-1}{k} + \binom{n-1}{k-1}$
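Both identities are easy to spot-check numerically; the sketch below uses Python's `math.comb`, which already returns 0 when $k > n$, matching the convention above.

```python
from math import comb

for n in range(1, 12):
    for k in range(n + 1):
        assert comb(n, k) == comb(n, n - k)                           # identity (a)
        if k >= 1:  # keep the lower index of comb non-negative
            assert comb(n, k) == comb(n - 1, k) + comb(n - 1, k - 1)  # identity (b)
```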
Theorem (The Binomial Theorem)
$$(x + y)^n = \sum_{k=0}^{n} \binom{n}{k} x^k y^{n-k}, \quad n = 1, 2, 3, \ldots$$
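A quick numerical check of the expansion for one arbitrary choice of $x$, $y$, and $n$:

```python
from math import comb

x, y, n = 3, 5, 7
lhs = (x + y) ** n  # 8**7 = 2097152
rhs = sum(comb(n, k) * x**k * y**(n - k) for k in range(n + 1))
assert lhs == rhs
```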
Theorem (Trinomial Theorem)
$$(x + y + z)^n = \sum_{\substack{i, j, k \ge 0 \\ i + j + k = n}} \frac{n!}{i!\, j!\, k!}\, x^i y^j z^k$$
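The same kind of check works for the trinomial expansion; the inner index $k = n - i - j$ is determined by the constraint $i + j + k = n$:

```python
from math import factorial

x, y, z, n = 2, 3, 5, 6
lhs = (x + y + z) ** n  # 10**6
rhs = sum(
    factorial(n) // (factorial(i) * factorial(j) * factorial(n - i - j))
    * x**i * y**j * z**(n - i - j)
    for i in range(n + 1)
    for j in range(n - i + 1)
)
assert lhs == rhs
```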
Definition (Sample Space)
The sample space is the set of all possible outcomes; it is denoted $S$.
Definition (Event)
An event is a subset of the sample space.
Proposition
Let $S$ be a finite sample space, and suppose that all outcomes are equally likely. Then, for all $E \subset S$,
$$P(E) = \frac{|E|}{|S|}.$$
Proposition
If $E \subset F \subset S$, then $P(E) \le P(F)$.
Proposition
For every event $E \subset S$, $P(E^c) = 1 - P(E)$, where $E^c$ denotes the complement of $E$.
Proposition
For all events $E, F \subset S$, $P(E \cup F) = P(E) + P(F) - P(EF)$. More generally (inclusion-exclusion), for events $E_1, E_2, \ldots, E_n \subset S$,
$$P\!\left(\bigcup_{i=1}^{n} E_i\right) = \sum_{i=1}^{n} P(E_i) - \sum_{i < j} P(E_i E_j) + \cdots + (-1)^{n+1}\, P(E_1 E_2 \cdots E_n).$$
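A sketch checking the three-event case on a finite, equally likely sample space, where $P(E) = |E|/|S|$ (the particular sets are arbitrary):

```python
S = set(range(20))
E1 = {x for x in S if x % 2 == 0}  # even outcomes
E2 = {x for x in S if x % 3 == 0}  # multiples of three
E3 = {x for x in S if x < 10}

def P(E):
    return len(E) / len(S)

lhs = P(E1 | E2 | E3)
rhs = (P(E1) + P(E2) + P(E3)
       - P(E1 & E2) - P(E1 & E3) - P(E2 & E3)
       + P(E1 & E2 & E3))
assert abs(lhs - rhs) < 1e-12
```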
Proposition
Let $A, B \subset S$ be events such that $0 < P(B) < 1$. Then,
$$P(A) = P(A \mid B)\, P(B) + P(A \mid B^c)\, P(B^c)$$
Proposition
Let $A \subset S$ be an event, and let $B_1, B_2, \ldots, B_n \subset S$ be events such that
(a) $0 < P(B_i) < 1$ for all $i$,
(b) $B_i B_j = \emptyset$ for all $i \ne j$,
(c) $\bigcup_{i=1}^{n} B_i = S$.
Then, $P(A) = P(A \mid B_1)\, P(B_1) + P(A \mid B_2)\, P(B_2) + \cdots + P(A \mid B_n)\, P(B_n) = \sum_{i=1}^{n} P(A \mid B_i)\, P(B_i)$.
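As a worked example (with made-up numbers): suppose a part comes from machine $B_1$ with probability 0.6 or machine $B_2$ with probability 0.4, and the defect probabilities are $P(A \mid B_1) = 0.02$ and $P(A \mid B_2) = 0.05$.

```python
P_B = {"B1": 0.6, "B2": 0.4}            # a partition of S
P_A_given_B = {"B1": 0.02, "B2": 0.05}  # conditional defect probabilities

P_A = sum(P_A_given_B[b] * P_B[b] for b in P_B)
print(P_A)  # 0.02*0.6 + 0.05*0.4 = 0.032 (up to float rounding)
```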
Definition (Partition)
Let $B_1, B_2, \ldots, B_n \subset S$ be events such that
(a) $0 < P(B_i) < 1$ for all $i$,
(b) $B_i B_j = \emptyset$ for all $i \ne j$,
(c) $\bigcup_{i=1}^{n} B_i = S$.
Then $\{B_1, \ldots, B_n\}$ is called a partition of $S$.
Definition (Independent)
Events $A, B \subset S$ are called independent if $P(AB) = P(A)\, P(B)$.
Proposition
Let $A, B \subset S$ be events such that $P(B) > 0$. Then, $A$ and $B$ are independent if and only if $P(A \mid B) = P(A)$.
Proposition
If $A$ and $B$ are independent events, then $A$ and $B^c$ are independent.
Definition (Independent)
Events $A_1, A_2, \ldots, A_n$ are said to be independent if
$$P(A_{i_1} A_{i_2} \cdots A_{i_k}) = P(A_{i_1})\, P(A_{i_2}) \cdots P(A_{i_k})$$
for all $1 \le i_1 < i_2 < \cdots < i_k \le n$, $2 \le k \le n$.
Fact
Let $X$ have distribution $F$. Then
(a) $P(X > x) = 1 - P(X \le x) = 1 - F(x)$ for all $x \in \mathbb{R}$.
(b) $P(x < X \le y) = F(y) - F(x)$ for all $y > x$.
Proposition
Let $X$ be a discrete random variable with probability mass function $p$. For any function $g : \mathbb{R} \to \mathbb{R}$,
$$E[g(X)] = \sum_{x \in D(p)} g(x)\, p(x).$$
Definition (Variance)
If $X$ is a random variable with a finite mean $\mu$, we define the variance of $X$ by:
$$\mathrm{Var}(X) = E[(X - \mu)^2].$$
Proposition
$$\mathrm{Var}(X) = E[X^2] - (E[X])^2$$
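A sketch checking the shortcut against the definition for a fair six-sided die, using exact rational arithmetic:

```python
from fractions import Fraction

values = range(1, 7)
p = Fraction(1, 6)                     # equally likely outcomes
EX = sum(x * p for x in values)        # E[X]   = 7/2
EX2 = sum(x * x * p for x in values)   # E[X^2] = 91/6
var_shortcut = EX2 - EX**2             # E[X^2] - (E[X])^2
var_definition = sum((x - EX) ** 2 * p for x in values)
assert var_shortcut == var_definition == Fraction(35, 12)
```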
Proposition
For all $a, b \in \mathbb{R}$, $\mathrm{Var}(aX + b) = a^2\, \mathrm{Var}(X)$.
Definition (Binomial(n, p)-distributed)
Let $n \ge 1$ and $p \in (0,1)$. A random variable $X$ with probability mass function
$$p(k) = P(X = k) = \binom{n}{k} p^k (1-p)^{n-k} \quad \text{for all } k = 0, 1, \ldots, n$$
is called binomial$(n, p)$-distributed.
Proposition
If $X \sim \mathrm{Binomial}(n, p)$, then $E(X) = np$ and $\mathrm{Var}(X) = np(1-p)$.
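A numerical spot-check of both formulas by summing over the probability mass function, for one arbitrary choice of $n$ and $p$:

```python
from math import comb

n, p = 10, 0.3
pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]
EX = sum(k * pmf[k] for k in range(n + 1))
EX2 = sum(k * k * pmf[k] for k in range(n + 1))
assert abs(EX - n * p) < 1e-12                     # E(X)   = np       = 3
assert abs(EX2 - EX**2 - n * p * (1 - p)) < 1e-12  # Var(X) = np(1-p)  = 2.1
```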
Proposition
Let $X_n \sim \mathrm{Binomial}(n, \lambda/n)$. Then, for all $k \ge 0$,
$$\lim_{n \to \infty} P(X_n = k) = e^{-\lambda}\, \frac{\lambda^k}{k!}.$$
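The convergence is easy to see numerically: with $\lambda = 2$ and $k = 3$, the binomial probabilities approach the Poisson value $e^{-2}\, 2^3 / 3! \approx 0.1804$ as $n$ grows.

```python
from math import comb, exp, factorial

lam, k = 2.0, 3
for n in (10, 100, 10_000):
    p = lam / n
    print(n, comb(n, k) * p**k * (1 - p) ** (n - k))  # ~0.2013, ~0.1823, ~0.1804
print("limit:", exp(-lam) * lam**k / factorial(k))    # ~0.1804
```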
Fact
$$p(k) = e^{-\lambda}\, \frac{\lambda^k}{k!} \quad \text{for } k = 0, 1, 2, \ldots$$
defines a probability mass function.
Proposition
If $X \sim \mathrm{Poisson}(\lambda)$, then $E[X] = \mathrm{Var}(X) = \lambda$.
Fact
$p(k) = (1-p)^{k-1} p$, $k \ge 1$, defines a probability mass function.
Definition (Geometric(p)-distributed)
Let $p \in (0,1)$. A random variable with the probability mass function $p(k) = (1-p)^{k-1} p$, $k \ge 1$, is called geometric$(p)$-distributed with parameter $p$.
Proposition
If $X \sim \mathrm{geometric}(p)$, then $E[X] = \frac{1}{p}$ and $\mathrm{Var}(X) = \frac{1-p}{p^2}$.
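A sketch checking both formulas by truncating the infinite sums at a large $N$ (the neglected tail is negligible):

```python
from math import isclose

p = 0.25
N = 10_000
pmf = [(1 - p) ** (k - 1) * p for k in range(1, N + 1)]
EX = sum(k * q for k, q in enumerate(pmf, start=1))
EX2 = sum(k * k * q for k, q in enumerate(pmf, start=1))
assert isclose(EX, 1 / p)                    # E[X]   = 4
assert isclose(EX2 - EX**2, (1 - p) / p**2)  # Var(X) = 12
```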
Fact
Let $X$ be a continuous random variable. Then, $P(X = a) = 0$ for all $a \in \mathbb{R}$.
Definition (Uniform(α, β)-distributed)
Let $\alpha < \beta$. A random variable $X$ is uniformly distributed on $(\alpha, \beta)$ if its density function is given by
$$f(x) = \begin{cases} \frac{1}{\beta - \alpha} & x \in (\alpha, \beta) \\ 0 & \text{otherwise.} \end{cases}$$
Proposition
Let X be a continuous random variable. Its cumulative distribution function F
has the following properties:
(a) $F$ is increasing
(b) $\lim_{x \to -\infty} F(x) = 0$, $\lim_{x \to \infty} F(x) = 1$
(c) $F'(x) = f(x)$ where $f$ is the density of $X$.
Definition (Expectation)
We define the expectation of a continuous random variable $X$ with density $f$ by:
$$E[X] = \int_{-\infty}^{\infty} x f(x)\, dx$$
whenever this integral makes sense.
Proposition
Let $X$ be a continuous random variable with density function $f_X$, and let $g : \mathbb{R} \to \mathbb{R}$. Then
$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx$$
whenever the integral makes sense.
Lemma
For a non-negative random variable $Y$, $E[Y] = \int_{0}^{\infty} P(Y > y)\, dy$.
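For instance, for $Y \sim \mathrm{exponential}(1)$ we have $P(Y > y) = e^{-y}$ and $E[Y] = 1$, so a Riemann-sum approximation of the integral should come out close to 1:

```python
from math import exp

dy = 1e-4
integral = sum(exp(-i * dy) * dy for i in range(int(50 / dy)))  # ~ ∫ P(Y > y) dy
assert abs(integral - 1.0) < 1e-3
```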
Corollary
For all $a, b \in \mathbb{R}$, $E[aX + b] = aE[X] + b$.
Definition (Exponential(λ)-distributed)
Let $\lambda > 0$. A random variable $X$ is exponential$(\lambda)$-distributed if its density is given by
$$f(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & \text{else.} \end{cases}$$
Proposition
Let $X \sim \mathrm{exponential}(\lambda)$. Then, $E[X] = \frac{1}{\lambda}$, $\mathrm{Var}(X) = \frac{1}{\lambda^2}$.
Theorem
Let $X$ be a continuous random variable with density $f_X$. Let $g$ be a continuous, strictly monotone function. Then the random variable $Y = g(X)$ has the following density:
$$f_Y(y) = \begin{cases} f_X\!\left(g^{-1}(y)\right) \left| \frac{d}{dy} g^{-1}(y) \right| & y = g(x) \text{ for some } x \\ 0 & \text{else.} \end{cases}$$
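A simulation sketch for one concrete case: $X \sim \mathrm{uniform}(0,1)$ and $g(x) = x^2$, which is strictly increasing on $(0,1)$. The theorem gives $f_Y(y) = \frac{1}{2\sqrt{y}}$ on $(0,1)$, so $P(Y \le 0.25) = \int_0^{0.25} \frac{1}{2\sqrt{y}}\, dy = \sqrt{0.25} = 0.5$, which Monte Carlo sampling should reproduce:

```python
import random

random.seed(0)
n = 200_000
samples = [random.random() ** 2 for _ in range(n)]  # Y = g(X) = X^2
mc = sum(y <= 0.25 for y in samples) / n            # Monte Carlo P(Y <= 0.25)
assert abs(mc - 0.5) < 0.01
```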
Definition
We say that $X$ is normally distributed with parameters $\mu$ and $\sigma^2$ if the density is given by the following:
$$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}, \quad x \in \mathbb{R}.$$
Fact
$$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}, \quad x \in \mathbb{R}$$
is a probability density.
Proposition
Let $X \sim N(\mu, \sigma^2)$. Then, $E[X] = \mu$, $\mathrm{Var}(X) = \sigma^2$.
Theorem
Suppose the moment generating function $M(t) = E[e^{tX}]$ of a random variable $X$ is finite on some open interval containing the origin. Then,
(a) $E[X] = M'(0)$
(b) For all $k \ge 1$, $E[X^k] = M^{(k)}(0)$
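For example, for $X \sim \mathrm{exponential}(\lambda)$ the moment generating function is $M(t) = \frac{\lambda}{\lambda - t}$ for $t < \lambda$, and a central finite difference at 0 recovers $E[X] = 1/\lambda$:

```python
from math import isclose

lam = 2.0

def M(t):
    return lam / (lam - t)  # MGF of exponential(lam), valid for t < lam

h = 1e-5
M_prime_0 = (M(h) - M(-h)) / (2 * h)  # numerical approximation of M'(0)
assert isclose(M_prime_0, 1 / lam, rel_tol=1e-6)
```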
Definition (Marginal Probability Mass Function)
Let $X$ and $Y$ be discrete random variables with joint probability mass function $p$, and let $\Delta$ be the set of possible values of $X$. The marginal probability mass function of $Y$ is
$$p_Y(y) = \sum_{x \in \Delta} p(x, y).$$
Fact
If X and Y are jointly continuous, then both X and Y are continuous
random variables.
Definition (Independent)
Two random variables $X$ and $Y$ are called independent if for all intervals $A, B \subset \mathbb{R}$,
$$P(X \in A,\, Y \in B) = P(X \in A)\, P(Y \in B).$$
Proposition
(a) Discrete random variables $X$ and $Y$ are independent if and only if $p(x, y) = p_X(x)\, p_Y(y)$ for all $x, y$.
(b) Random variables $X$ and $Y$ with a joint density function $f(x, y)$ are independent if and only if $f(x, y) = f_X(x)\, f_Y(y)$ for all $x, y$.
Proposition
Let $X$ and $Y$ be two independent, continuous random variables with densities $f_X$, $f_Y$. Then, $X + Y$ is a continuous random variable with density function:
$$f_{X+Y}(a) = \int_{-\infty}^{\infty} f_X(a - y)\, f_Y(y)\, dy$$
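A numerical check for $X, Y \sim \mathrm{uniform}(0,1)$, whose sum has the triangular density $f_{X+Y}(a) = a$ for $0 \le a \le 1$ and $2 - a$ for $1 \le a \le 2$:

```python
def f_unif(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_sum(a, dy=1e-4):
    # Riemann-sum approximation of the convolution integral
    return sum(f_unif(a - i * dy) * f_unif(i * dy) * dy
               for i in range(int(3 / dy)))

assert abs(f_sum(0.7) - 0.7) < 1e-2
assert abs(f_sum(1.5) - 0.5) < 1e-2
```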
Proposition
Let $X_1$ and $X_2$ be normally distributed with parameters $(\mu_1, \sigma_1^2)$ and $(\mu_2, \sigma_2^2)$ respectively. If $X_1$ and $X_2$ are independent, then $X_1 + X_2 \sim N(\mu_1 + \mu_2,\, \sigma_1^2 + \sigma_2^2)$.
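A simulation sketch (note that `random.gauss` takes the standard deviation, not the variance):

```python
import random
from statistics import mean, variance

random.seed(1)
n = 100_000
sums = [random.gauss(1.0, 2.0) + random.gauss(-3.0, 1.5) for _ in range(n)]
print(mean(sums))      # ~ mu1 + mu2           = -2.0
print(variance(sums))  # ~ sigma1^2 + sigma2^2 = 4 + 2.25 = 6.25
```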
Proposition
If $X$ and $Y$ are independent and Poisson-distributed with parameters $\lambda$ and $\mu$ respectively, then $X + Y \sim \mathrm{Poisson}(\lambda + \mu)$.
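This can be confirmed numerically by convolving the two probability mass functions directly, the discrete analogue of the convolution proposition above:

```python
from math import exp, factorial, isclose

lam, mu, k = 1.5, 2.5, 4

def pois(rate, j):
    return exp(-rate) * rate**j / factorial(j)

# P(X + Y = k) by summing over the ways to split k between X and Y ...
conv = sum(pois(lam, j) * pois(mu, k - j) for j in range(k + 1))
# ... agrees with the Poisson(lam + mu) pmf at k.
assert isclose(conv, pois(lam + mu, k), rel_tol=1e-9)
```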