
Booklet For Exam

The document defines key concepts in probability and statistics, including: 1) Probability is defined as a function that maps events to real numbers between 0 and 1, following three axioms. 2) Common probability distributions are described, including the binomial, Poisson, geometric, and exponential distributions. These define the probability of different outcomes. 3) Conditional probability is the probability of an event A given that another event B has occurred, written as P(A|B). Bayes' theorem allows calculating conditional probabilities.


Probability

Definitions
Outcome: An outcome of an experiment is any possible observation of that experiment.
Sample Space: The sample space of an experiment is the finest-grain, mutually exclusive, collectively exhaustive set of all possible outcomes.
Event: An event is a set of outcomes of an experiment.

De Morgan's law:
(A \cup B)^c = A^c \cap B^c

Axioms of Probability:
A probability measure P[.] is a function that maps events in the sample space to real numbers such that
Axiom 1. For any event A, P[A] \ge 0.
Axiom 2. P[S] = 1.
Axiom 3. For any countable collection A_1, A_2, \ldots of mutually exclusive events,
P[A_1 \cup A_2 \cup \cdots] = P[A_1] + P[A_2] + \cdots

Theorem:
For any event A and event space \{B_1, B_2, \ldots, B_m\},
P[A] = \sum_{i=1}^{m} P[A \cap B_i]

Conditional Probability:
P[A \mid B] = \frac{P[AB]}{P[B]}

Law of Total Probability:
If B_1, B_2, \ldots, B_m is an event space and P[B_i] > 0 for i = 1, \ldots, m, then
P[A] = \sum_{i=1}^{m} P[A \mid B_i] P[B_i]

Independent Events:
Events A and B are independent if and only if
P[AB] = P[A] P[B]

Bayes' Theorem:
P[B \mid A] = \frac{P[A \mid B] P[B]}{P[A]}
P[B_i \mid A] = \frac{P[A \mid B_i] P[B_i]}{\sum_{j=1}^{m} P[A \mid B_j] P[B_j]}
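
A minimal Python sketch (not part of the original booklet) of Bayes' theorem over the two-event partition {B, B^c}; all probabilities here are arbitrary example numbers, not values from the text.

p_b = 0.01          # prior P[B] (assumed example value)
p_a_given_b = 0.95  # P[A | B] (assumed)
p_a_given_not_b = 0.10  # P[A | B^c] (assumed)

# Law of total probability: P[A] = sum_i P[A | B_i] P[B_i]
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)

# Bayes' theorem: P[B | A] = P[A | B] P[B] / P[A]
p_b_given_a = p_a_given_b * p_b / p_a
print(f"P[A] = {p_a:.4f}, P[B|A] = {p_b_given_a:.4f}")
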
Theorem:
The number of k-permutations of n distinguishable objects is
(n)_k = n(n-1)(n-2) \cdots (n-k+1) = \frac{n!}{(n-k)!}
The number of k-combinations of n distinguishable objects is
\binom{n}{k} = \frac{n!}{k!\,(n-k)!}


Discrete Random Variables

Expectation:
E[X] = \mu_X = \sum_{x \in S_X} x P_X(x)

Theorem:
Given a random variable X with PMF P_X(x) and the derived random variable Y = g(X), the expected value of Y is
E[Y] = \mu_Y = \sum_{x \in S_X} g(x) P_X(x)

Variance:
\mathrm{Var}[X] = \sigma_X^2 = E[(X - \mu_X)^2]
\mathrm{Var}[X] = E[X^2] - \mu_X^2 = E[X^2] - (E[X])^2

Theorem:
If Y = X + b, \mathrm{Var}[Y] = \mathrm{Var}[X].
If Y = aX, \mathrm{Var}[Y] = a^2\,\mathrm{Var}[X].
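
A minimal sketch of E[X], E[g(X)], and Var[X] computed directly from the definitions above; the PMF is a made-up example, not one from the booklet.

pmf = {1: 0.2, 2: 0.5, 3: 0.3}                       # example P_X(x) on S_X = {1, 2, 3}

mean = sum(x * p for x, p in pmf.items())            # E[X] = sum of x * P_X(x)
e_x2 = sum((x ** 2) * p for x, p in pmf.items())     # E[g(X)] with g(x) = x^2
var = e_x2 - mean ** 2                               # Var[X] = E[X^2] - (E[X])^2
print(mean, var)                                     # 2.1 and 0.49
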


Bernoulli RV
For 0 \le p \le 1,
P_X(x) = \begin{cases} 1-p & x = 0 \\ p & x = 1 \\ 0 & \text{otherwise} \end{cases}
E[X] = p, \mathrm{Var}[X] = p(1-p)

Binomial RV
For a positive integer n and 0 \le p \le 1,
P_X(x) = \begin{cases} \binom{n}{x} p^x (1-p)^{n-x} & x = 0, 1, 2, \ldots, n \\ 0 & \text{otherwise} \end{cases}
E[X] = np, \mathrm{Var}[X] = np(1-p)

Geometric RV
For 0 < p \le 1,
P_X(x) = \begin{cases} p(1-p)^{x-1} & x = 1, 2, \ldots \\ 0 & \text{otherwise} \end{cases}
E[X] = \frac{1}{p}, \mathrm{Var}[X] = \frac{1-p}{p^2}

Pascal RV
For 0 < p < 1,
P_X(x) = \begin{cases} \binom{x-1}{k-1} p^k (1-p)^{x-k} & x = k, k+1, \ldots \\ 0 & \text{otherwise} \end{cases}
E[X] = \frac{k}{p}, \mathrm{Var}[X] = \frac{k(1-p)}{p^2}

Discrete Uniform RV
For integers k and l such that k < l,
P_X(x) = \begin{cases} \frac{1}{l-k+1} & x = k, k+1, k+2, \ldots, l \\ 0 & \text{otherwise} \end{cases}
E[X] = \frac{k+l}{2}, \mathrm{Var}[X] = \frac{(l-k)(l-k+2)}{12}

Poisson RV
For \alpha > 0,
P_X(x) = \begin{cases} \frac{\alpha^x e^{-\alpha}}{x!} & x = 0, 1, 2, \ldots \\ 0 & \text{otherwise} \end{cases}
E[X] = \alpha, \mathrm{Var}[X] = \alpha
Note: For some applications, \alpha = \lambda t.
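
A minimal sketch building the binomial PMF from the formula above and checking E[X] = np and Var[X] = np(1-p) numerically; n and p are arbitrary choices.

from math import comb

n, p = 10, 0.3
pmf = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}

mean = sum(x * q for x, q in pmf.items())
var = sum(x**2 * q for x, q in pmf.items()) - mean**2
print(mean, n * p)            # both 3.0 (up to rounding)
print(var, n * p * (1 - p))   # both 2.1 (up to rounding)
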
Misc:
\sum_{x=1}^{\infty} x q^{x-1} = \frac{1}{(1-q)^2}
\sum_{n=0}^{N-1} \alpha^n = \frac{1 - \alpha^N}{1 - \alpha}


Conditional Probability

Conditional PMF:
Given the event B, with P[B] > 0, the conditional probability mass function of X is
P_{X|B}(x) = P[X = x \mid B]

Theorem:
A random variable X resulting from an experiment with event space B_1, \ldots, B_m has PMF
P_X(x) = \sum_{i=1}^{m} P_{X|B_i}(x) P[B_i]


Theorem:
The conditional PMF P_{X|B}(x) of X given B satisfies
P_{X|B}(x) = \begin{cases} \frac{P_X(x)}{P[B]} & x \in B \\ 0 & \text{otherwise} \end{cases}

Conditional Expected Value:
The conditional expected value of random variable X given condition B is
E[X \mid B] = \mu_{X|B} = \sum_{x \in B} x P_{X|B}(x)

Theorem:
For a random variable X resulting from an experiment with event space B_1, \ldots, B_m,
E[X] = \sum_{i=1}^{m} E[X \mid B_i] P[B_i]
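
A minimal sketch of the iterated-expectation theorem above for a geometric RV partitioned by B_1 = {X <= 2} and B_2 = {X > 2}; this example (and the truncation at x = 200, which leaves negligible tail mass) is assumed for illustration.

p = 0.5
pmf = {x: p * (1 - p) ** (x - 1) for x in range(1, 201)}   # geometric PMF, truncated

b1 = [x for x in pmf if x <= 2]
b2 = [x for x in pmf if x > 2]

def cond_expect(event):
    p_b = sum(pmf[x] for x in event)                   # P[B]
    return sum(x * pmf[x] / p_b for x in event), p_b   # E[X | B], P[B]

e1, pb1 = cond_expect(b1)
e2, pb2 = cond_expect(b2)
print(e1 * pb1 + e2 * pb2)   # ~2.0 = 1/p, the geometric E[X]
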
Continuous Random Variables

CDF: F_X(x) = P[X \le x]
PDF: f_X(x) = \frac{dF_X(x)}{dx}

Theorem:
P[x_1 < X \le x_2] = \int_{x_1}^{x_2} f_X(x)\,dx

Expectation:
E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx

Uniform RV - Continuous
For constants a < b,
f_X(x) = \begin{cases} \frac{1}{b-a} & a < x < b \\ 0 & \text{otherwise} \end{cases}
E[X] = \frac{a+b}{2}, \mathrm{Var}[X] = \frac{(b-a)^2}{12}

Exponential RV
For \lambda > 0,
f_X(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & \text{otherwise} \end{cases}
E[X] = \frac{1}{\lambda}, \mathrm{Var}[X] = \frac{1}{\lambda^2}

Erlang RV
For \lambda > 0 and a positive integer n,
f_X(x) = \begin{cases} \frac{\lambda^n x^{n-1} e^{-\lambda x}}{(n-1)!} & x \ge 0 \\ 0 & \text{otherwise} \end{cases}
E[X] = \frac{n}{\lambda}, \mathrm{Var}[X] = \frac{n}{\lambda^2}

Gaussian RV
For constants \sigma > 0 and -\infty < \mu < \infty,
f_X(x) = \frac{e^{-(x-\mu)^2 / 2\sigma^2}}{\sigma \sqrt{2\pi}}, \quad -\infty < x < \infty
E[X] = \mu, \mathrm{Var}[X] = \sigma^2

Standard Normal Random Variable Z:
z = \frac{x - \mu}{\sigma}

Standard Normal CDF:
\Phi(z) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{z} e^{-u^2/2}\,du
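
A minimal sketch expressing \Phi with math.erf and computing a Gaussian interval probability via the standardization z = (x - \mu)/\sigma; \mu, \sigma, x_1, x_2 are arbitrary example values.

from math import erf, sqrt

def phi(z):
    # Phi(z) = (1/sqrt(2 pi)) * integral_{-inf}^{z} e^{-u^2/2} du
    return 0.5 * (1 + erf(z / sqrt(2)))

def q(z):
    return 1 - phi(z)   # complementary CDF Q(z) = P[Z > z]

mu, sigma = 5.0, 2.0
x1, x2 = 4.0, 8.0
# P[x1 < X <= x2] = Phi((x2 - mu)/sigma) - Phi((x1 - mu)/sigma)
print(phi((x2 - mu) / sigma) - phi((x1 - mu) / sigma))   # ~0.6247
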


Standard Normal Complementary CDF:
Q(z) = P[Z > z] = \frac{1}{\sqrt{2\pi}} \int_{z}^{\infty} e^{-u^2/2}\,du = 1 - \Phi(z)

Theorem:
Given random variable X and a constant a > 0, the PDF and CDF of Y = aX are
f_Y(y) = \frac{1}{a} f_X\left(\frac{y}{a}\right)
F_Y(y) = F_X\left(\frac{y}{a}\right)

Theorem:
Given random variable X, the PDF and CDF of V = X + b are
f_V(v) = f_X(v - b)
F_V(v) = F_X(v - b)


Pairs of Random Variables

Joint CDF:
F_{X,Y}(x,y) = P[X \le x, Y \le y]
F_{X,Y}(x,y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u,v)\,dv\,du

Joint PDF:
f_{X,Y}(x,y) = \frac{\partial^2 F_{X,Y}(x,y)}{\partial x\,\partial y}

Marginal PDF:
f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy
f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx

Independence:
f_{X,Y}(x,y) = f_X(x) f_Y(y)

Joint Probability Mass Function:
P_{X,Y}(x,y) = P[X = x, Y = y]

Marginal PMF:
P_X(x) = \sum_{y} P_{X,Y}(x,y)
P_Y(y) = \sum_{x} P_{X,Y}(x,y)

Expectation:
The expected value of the discrete random variable W = g(X,Y) is
E[W] = \sum_{x \in S_X} \sum_{y \in S_Y} g(x,y) P_{X,Y}(x,y)

The expectation of g(X,Y) = g_1(X,Y) + \cdots + g_n(X,Y) is
E[g(X,Y)] = E[g_1(X,Y)] + \cdots + E[g_n(X,Y)]

Expectation of the sum of 2 RVs:
E[X + Y] = E[X] + E[Y]

Variance of the sum of 2 RVs:
\mathrm{Var}[X + Y] = \mathrm{Var}[X] + \mathrm{Var}[Y] + 2E[(X - \mu_X)(Y - \mu_Y)]
= \mathrm{Var}[X] + \mathrm{Var}[Y] + 2\,\mathrm{Cov}[X, Y]

Covariance:
\mathrm{Cov}[X, Y] = E[(X - \mu_X)(Y - \mu_Y)]

Correlation:
r_{XY} = E[XY]
\mathrm{Cov}[X, Y] = E[XY] - \mu_X \mu_Y

Correlation Coefficient:
\rho_{XY} = \frac{\mathrm{Cov}[X, Y]}{\sqrt{\mathrm{Var}[X]\,\mathrm{Var}[Y]}}
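
A minimal sketch computing Cov[X,Y] and \rho_{XY} from a small joint PMF, using the formulas above; the joint PMF is a made-up example.

joint = {(0, 0): 0.3, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.5}   # example P_{X,Y}(x,y)

ex = sum(x * p for (x, y), p in joint.items())        # E[X] (marginal sum)
ey = sum(y * p for (x, y), p in joint.items())        # E[Y]
exy = sum(x * y * p for (x, y), p in joint.items())   # r_XY = E[XY]
cov = exy - ex * ey                                   # Cov = E[XY] - mu_X mu_Y

vx = sum(x * x * p for (x, y), p in joint.items()) - ex ** 2
vy = sum(y * y * p for (x, y), p in joint.items()) - ey ** 2
rho = cov / (vx * vy) ** 0.5
print(cov, rho)   # 0.14 and ~0.583, a positive correlation
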
Sums of Random Variables

W_n = X_1 + \cdots + X_n

Expected value of W_n:
E[W_n] = E[X_1] + \cdots + E[X_n]

Variance of W_n:
\mathrm{Var}[W_n] = \sum_{i=1}^{n} \mathrm{Var}[X_i] + 2 \sum_{i=1}^{n} \sum_{j=i+1}^{n} \mathrm{Cov}[X_i, X_j]
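
A minimal Monte Carlo sketch of E[W_n] = n E[X] and, for iid terms (all covariances zero), Var[W_n] = n Var[X]; Uniform(0,1) terms and the trial count are arbitrary choices.

import random

n, trials = 10, 200_000
samples = [sum(random.random() for _ in range(n)) for _ in range(trials)]

mean = sum(samples) / trials
var = sum((w - mean) ** 2 for w in samples) / trials
print(mean, var)   # ~5.0 and ~0.833, i.e. n*(1/2) and n*(1/12)
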


PDF of W = X + Y:
f_W(w) = \int_{-\infty}^{\infty} f_{X,Y}(x, w - x)\,dx = \int_{-\infty}^{\infty} f_{X,Y}(w - y, y)\,dy

PDF of W = X + Y, when X and Y are independent:
f_W(w) = \int_{-\infty}^{\infty} f_X(x) f_Y(w - x)\,dx = \int_{-\infty}^{\infty} f_X(w - y) f_Y(y)\,dy
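
A minimal sketch of the independent-sum convolution above, approximated by a Riemann sum: two Uniform(0,1) PDFs (an assumed example) convolve into the triangular PDF on (0, 2).

step = 0.01
xs = [i * step for i in range(int(2 / step) + 1)]   # grid on [0, 2]

def f_uniform(x):          # Uniform(0,1) PDF
    return 1.0 if 0 <= x <= 1 else 0.0

def f_w(w):
    # f_W(w) = integral of f_X(x) f_Y(w - x) dx, approximated on the grid
    return sum(f_uniform(x) * f_uniform(w - x) for x in xs) * step

print(f_w(0.5), f_w(1.0), f_w(1.5))   # ~0.5, ~1.0, ~0.5 (triangle shape)
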
Moment Generating Function (MGF):
\phi_X(s) = E[e^{sX}]

The MGF satisfies \phi_X(s)\big|_{s=0} = 1.

The MGF of Y = aX + b satisfies
\phi_Y(s) = e^{sb} \phi_X(as)

The nth moment:
E[X^n] = \frac{d^n \phi_X(s)}{ds^n} \bigg|_{s=0}

Theorem:
For X_1, \ldots, X_n a sequence of independent RVs, the MGF of W = X_1 + \cdots + X_n is
\phi_W(s) = \phi_{X_1}(s) \phi_{X_2}(s) \cdots \phi_{X_n}(s)
For iid X_1, \ldots, X_n, each with MGF \phi_X(s), the MGF of W = X_1 + \cdots + X_n is
\phi_W(s) = [\phi_X(s)]^n
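
A minimal sketch extracting moments from an MGF by numerical differentiation at s = 0, using the exponential RV's MGF \phi(s) = \lambda/(\lambda - s) as the assumed example.

lam = 2.0
def phi(s):
    return lam / (lam - s)   # exponential MGF, valid for s < lam

h = 1e-4
first = (phi(h) - phi(-h)) / (2 * h)                # E[X] ~ phi'(0) = 1/lam
second = (phi(h) - 2 * phi(0) + phi(-h)) / h ** 2   # E[X^2] ~ phi''(0) = 2/lam^2
print(first, second - first ** 2)   # ~0.5 and ~0.25, i.e. 1/lam and 1/lam^2
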
Random Sums of independent RVs:
R = X_1 + \cdots + X_N
\phi_R(s) = \phi_N(\ln \phi_X(s))
E[R] = E[N] E[X]
\mathrm{Var}[R] = E[N]\,\mathrm{Var}[X] + \mathrm{Var}[N] (E[X])^2

Central Limit Theorem Approximation:
Let W_n = X_1 + \cdots + X_n be an iid random sum with E[X] = \mu_X and \mathrm{Var}[X] = \sigma_X^2. The CDF of W_n may be approximated by
F_{W_n}(w) \approx \Phi\left( \frac{w - n\mu_X}{\sqrt{n \sigma_X^2}} \right)
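
A minimal sketch comparing the CLT approximation above to simulation, for iid Uniform(0,1) terms (\mu_X = 1/2, \sigma_X^2 = 1/12); n, w, and the trial count are arbitrary.

import random
from math import erf, sqrt

n, w = 30, 16.0
mu, var = 0.5, 1.0 / 12.0

# CLT: F_Wn(w) ~ Phi((w - n*mu) / sqrt(n*var))
z = (w - n * mu) / sqrt(n * var)
clt = 0.5 * (1 + erf(z / sqrt(2)))

trials = 100_000
hits = sum(sum(random.random() for _ in range(n)) <= w for _ in range(trials))
print(clt, hits / trials)   # both ~0.74
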

Stochastic Process

The Expected Value of a Process:
\mu_X(t) = E[X(t)]

Poisson Process of rate \lambda:
P_{N(T)}(n) = \begin{cases} \frac{(\lambda T)^n e^{-\lambda T}}{n!} & n = 0, 1, 2, \ldots \\ 0 & \text{otherwise} \end{cases}

Theorem:
For a Poisson process of rate \lambda, the inter-arrival times X_1, X_2, \ldots are an iid random sequence with the exponential PDF
f_X(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & \text{otherwise} \end{cases}

Autocovariance:
C_X(t, \tau) = \mathrm{Cov}[X(t), X(t + \tau)]

Theorem:
C_X(t, \tau) = R_X(t, \tau) - \mu_X(t)\,\mu_X(t + \tau)

Autocorrelation:
R_X(t, \tau) = E[X(t)\,X(t + \tau)]

Stationary Process:
f_{X(t_1) \cdots X(t_m)}(x_1, \ldots, x_m) = f_{X(t_1+\tau) \cdots X(t_m+\tau)}(x_1, \ldots, x_m)

Properties of a stationary process:
\mu_X(t) = \mu_X
R_X(t, \tau) = R_X(0, \tau) = R_X(\tau)
C_X(t, \tau) = R_X(\tau) - \mu_X^2 = C_X(\tau)
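
A minimal sketch simulating a Poisson process through iid exponential inter-arrival times (the theorem above), then checking that the count over [0, T] averages \lambda T; \lambda, T, and the trial count are arbitrary.

import random

lam, T, trials = 2.0, 10.0, 20_000

def count_arrivals():
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)   # exponential inter-arrival time
        if t > T:
            return n
        n += 1

counts = [count_arrivals() for _ in range(trials)]
print(sum(counts) / trials)   # ~lam*T = 20, the mean of the Poisson(lam*T) count
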


Wide Sense Stationary (WSS) Random Process:
E[X(t)] = \mu_X
R_X(t, \tau) = R_X(0, \tau) = R_X(\tau)

Properties of the ACF for a WSS RP:
R_X(0) \ge 0
R_X(\tau) = R_X(-\tau)
|R_X(\tau)| \le R_X(0)

Average Power of a WSS RP:
R_X(0) = E[X^2(t)]

Cross Correlation Function (CCF):
The cross correlation of random processes X(t) and Y(t) is
R_{XY}(t, \tau) = E[X(t)\,Y(t + \tau)]

Jointly WSS Processes:
The random processes X(t) and Y(t) are jointly wide sense stationary if X(t) and Y(t) are each wide sense stationary, and the cross correlation satisfies
R_{XY}(t, \tau) = R_{XY}(\tau)

Power Spectral Density (PSD):
For a WSS RP X(t), the ACF R_X(\tau) and PSD S_X(f) are the Fourier transform pair
S_X(f) = \int_{-\infty}^{\infty} R_X(\tau) e^{-j 2\pi f \tau}\,d\tau
R_X(\tau) = \int_{-\infty}^{\infty} S_X(f) e^{j 2\pi f \tau}\,df

Theorem:
For a WSS process X(t), the PSD S_X(f) is a real-valued function with the following properties:
S_X(f) \ge 0
\int_{-\infty}^{\infty} S_X(f)\,df = E[X^2(t)] = R_X(0)
S_X(-f) = S_X(f)

Power Spectral Density of a Random Sequence:
The PSD for a WSS random sequence X_n is
S_X(\phi) = \lim_{L \to \infty} \frac{1}{2L+1} E\left[ \left| \sum_{n=-L}^{L} X_n e^{-j 2\pi \phi n} \right|^2 \right]

Cross Spectral Density:
S_{XY}(f) = \int_{-\infty}^{\infty} R_{XY}(\tau) e^{-j 2\pi f \tau}\,d\tau
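
A minimal sketch of a finite-L estimate of the random-sequence PSD above, applied to iid zero-mean Gaussian noise (an assumed example of a white sequence, whose true PSD is flat at Var[X] = 1). A single realization is noisy, so the estimate is averaged over frequencies.

import cmath, random

L = 256
x = [random.gauss(0.0, 1.0) for _ in range(2 * L + 1)]   # samples X_{-L}, ..., X_L

def psd_estimate(phi):
    s = sum(xn * cmath.exp(-2j * cmath.pi * phi * n)
            for n, xn in zip(range(-L, L + 1), x))
    return abs(s) ** 2 / (2 * L + 1)

vals = [psd_estimate(k / 100) for k in range(1, 50)]
print(sum(vals) / len(vals))   # ~1.0, the flat PSD of white noise
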

Random Signal Processing

Theorem:
If the input to a linear time invariant filter with impulse response h(t) is a WSS RP X(t), the output Y(t) is a WSS RP with mean value
\mu_Y = \mu_X \int_{-\infty}^{\infty} h(t)\,dt = \mu_X H(0)
and ACF
R_Y(\tau) = \int_{-\infty}^{\infty} h(u) \int_{-\infty}^{\infty} h(v)\,R_X(\tau + u - v)\,dv\,du

Theorem:
When a WSS RP X(t) is the input to a linear time invariant filter with frequency response H(f), the PSD of the output Y(t) is
S_Y(f) = |H(f)|^2 S_X(f)

Theorem:
Let X(t) be a wide sense stationary input to a linear time invariant filter H(f). The input X(t) and output Y(t) satisfy
S_{XY}(f) = H(f) S_X(f)
S_Y(f) = H^*(f) S_{XY}(f)
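
A minimal discrete-time sketch of the output-PSD theorem: for white input (flat S_X = \sigma^2) through an FIR filter h, integrating S_Y = |H|^2 S_X over one period gives E[Y^2] = \sigma^2 \sum_k h_k^2. The 2-tap moving average and sample count are arbitrary choices.

import random

sigma2 = 1.0
h = [0.5, 0.5]                       # assumed 2-tap moving-average filter
n_samples = 200_000
x = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(n_samples)]   # white input

y = [h[0] * x[n] + h[1] * x[n - 1] for n in range(1, n_samples)]   # filter output
power = sum(v * v for v in y) / len(y)           # estimate of R_Y(0) = E[Y^2]
print(power, sigma2 * sum(c * c for c in h))     # both ~0.5
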


APPENDIX 1: Standard Normal CDF

APPENDIX 2: The Standard Normal Complementary CDF

APPENDIX 3: Moment Generating Functions for Families of Random Variables

APPENDIX 4: Fourier Transform Pairs

APPENDIX 5: Mathematical Tables
Trigonometric Identities
Approximation
Indefinite Integrals
Definite Integrals
Sequences and Series
