Booklet For Exam
Probability
Definition
Outcome: An outcome of an experiment is any possible observation of that experiment.

Sample Space: The sample space of an experiment is the finest-grain, mutually exclusive, collectively exhaustive set of all possible outcomes.

Event: An event is a set of outcomes of an experiment.

De Morgan's law:
(A ∪ B)^c = A^c ∩ B^c

Axioms of Probability:
A probability measure P[·] is a function that maps events in the sample space to real numbers such that
Axiom 1. For any event A, P[A] ≥ 0.
Axiom 2. P[S] = 1.
Axiom 3. For any countable collection A1, A2, … of mutually exclusive events,
P[A1 ∪ A2 ∪ …] = P[A1] + P[A2] + …

Theorem:
For any event A and event space {B1, B2, …, Bm},
P[A] = ∑_{i=1}^{m} P[A ∩ Bi]

Conditional Probability:
P[A | B] = P[AB] / P[B]

Law of Total Probability:
If B1, B2, …, Bm is an event space and P[Bi] > 0 for i = 1, …, m, then
P[A] = ∑_{i=1}^{m} P[A | Bi] P[Bi]

Bayes' Theorem:
P[B | A] = P[A | B] P[B] / P[A]
P[Bi | A] = P[A | Bi] P[Bi] / ∑_{i=1}^{m} P[A | Bi] P[Bi]

Independent Events:
Events A and B are independent if and only if P[AB] = P[A] P[B].

Theorem:
The number of k-permutations of n distinguishable objects is
(n)_k = n(n − 1)(n − 2)⋯(n − k + 1) = n! / (n − k)!
The number of k-combinations of n distinguishable objects is
(n choose k) = n! / (k!(n − k)!)

Discrete Random Variables

Expectation:
E[X] = μX = ∑_{x∈SX} x PX(x)

Theorem:
Given a random variable X with PMF PX(x) and the derived random variable Y = g(X), the expected value of Y is
E[Y] = μY = ∑_{x∈SX} g(x) PX(x)

Variance:
Var[X] = σX² = E[(X − μX)²]
Var[X] = E[X²] − μX² = E[X²] − (E[X])²

Theorem:
If Y = X + b, Var[Y] = Var[X].

Theorem:
For a discrete uniform (k, l) random variable X,
E[X] = (k + l) / 2,  Var[X] = (l − k)(l − k + 2) / 12
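Worked example (illustrative numbers): for an event space {B1, B2} with P[B1] = 0.3, P[B2] = 0.7 and P[A | B1] = 0.5, P[A | B2] = 0.1, the law of total probability gives P[A] = (0.5)(0.3) + (0.1)(0.7) = 0.22, and Bayes' theorem gives P[B1 | A] = (0.5)(0.3) / 0.22 ≈ 0.68.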
Binomial RV
PX(x) = (n choose x) p^x (1 − p)^{n−x} for x = 0, 1, 2, …, n; 0 otherwise
E[X] = np,  Var[X] = np(1 − p)

Poisson RV
For α > 0,
PX(x) = α^x e^{−α} / x! for x = 0, 1, 2, …; 0 otherwise
E[X] = α,  Var[X] = α

Geometric RV
For 0 < p ≤ 1,
PX(x) = p(1 − p)^{x−1} for x = 1, 2, …; 0 otherwise
E[X] = 1/p,  Var[X] = (1 − p)/p²

Misc:
∑_{x=1}^{∞} x q^{x−1} = 1 / (1 − q)²
∑_{n=0}^{N−1} α^n = (1 − α^N) / (1 − α)
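Worked example (illustrative numbers): for a binomial RV with n = 4 and p = 0.5, P[X = 2] = (4 choose 2)(0.5)²(0.5)² = 6/16 = 0.375, E[X] = np = 2, and Var[X] = np(1 − p) = 1; for a Poisson RV with α = 2, P[X = 0] = e^{−2} ≈ 0.135.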
Theorem:
For a random variable X resulting from an experiment with event space B1, …, Bm,
E[X] = ∑_{i=1}^{m} E[X | Bi] P[Bi]

Erlang RV
For a positive integer n,
fX(x) = λ^n x^{n−1} e^{−λx} / (n − 1)! for x ≥ 0; 0 otherwise
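Worked example (illustrative numbers): with event space {B1, B2}, P[B1] = P[B2] = 0.5, E[X | B1] = 2 and E[X | B2] = 4, the theorem gives E[X] = (2)(0.5) + (4)(0.5) = 3.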
Expectation:
E[X] = ∫_{−∞}^{∞} x fX(x) dx

Gaussian RV
fX(x) = (1 / √(2πσ²)) e^{−(x−μ)² / (2σ²)}
E[X] = μ,  Var[X] = σ²

Theorem:
Given random variable X and a constant a > 0, the CDF of Y = aX is
FY(y) = FX(y / a)

Theorem:
Given random variable X, the PDF and CDF of V = X + b are
fV(v) = fX(v − b),  FV(v) = FX(v − b)

Pairs of Random Variables

Joint CDF:
FX,Y(x, y) = P[X ≤ x, Y ≤ y]
FX,Y(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} fX,Y(u, v) dv du

Joint PDF:
fX,Y(x, y) = ∂²FX,Y(x, y) / ∂x∂y

Marginal PMF:
PX(x) = ∑_{y} PX,Y(x, y),  PY(y) = ∑_{x} PX,Y(x, y)

Marginal PDF:
fX(x) = ∫_{−∞}^{∞} fX,Y(x, y) dy,  fY(y) = ∫_{−∞}^{∞} fX,Y(x, y) dx

The expectation of g(X, Y) = g1(X, Y) + … + gn(X, Y) is
E[g(X, Y)] = E[g1(X, Y)] + … + E[gn(X, Y)]

Expectation of the sum of 2 RVs:
E[X + Y] = E[X] + E[Y]

Variance of the sum of 2 RVs:
Var[X + Y] = Var[X] + Var[Y] + 2E[(X − μX)(Y − μY)]
           = Var[X] + Var[Y] + 2Cov[X, Y]

Covariance:
Cov[X, Y] = E[(X − μX)(Y − μY)]

Correlation:
rXY = E[XY]
Cov[X, Y] = E[XY] − μXμY

Correlation Coefficient:
ρXY = Cov[X, Y] / √(Var[X]Var[Y])
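Worked example (illustrative numbers): if E[XY] = 3, μX = 1, μY = 2, Var[X] = 4 and Var[Y] = 1, then Cov[X, Y] = E[XY] − μXμY = 3 − 2 = 1 and ρXY = 1 / √((4)(1)) = 0.5.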
Independence:
fX,Y(x, y) = fX(x) fY(y)

Sums of Random Variables

Wn = X1 + … + Xn

Variance of Wn:
Var[Wn] = ∑_{i=1}^{n} Var[Xi] + 2 ∑_{i=1}^{n} ∑_{j=i+1}^{n} Cov[Xi, Xj]

For W = X + Y with X and Y independent:
fW(w) = ∫_{−∞}^{∞} fX(x) fY(w − x) dx = ∫_{−∞}^{∞} fX(w − y) fY(y) dy

Stochastic Process

The Expected Value of a Process:
μX(t) = E[X(t)]
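Worked example (illustrative numbers): for W3 = X1 + X2 + X3 with Var[Xi] = 2 and Cov[Xi, Xj] = 0 for i ≠ j, Var[W3] = 2 + 2 + 2 = 6.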
Moment Generating Function (MGF):
φX(s) = E[e^{sX}]

MGF satisfies: φX(s)|_{s=0} = 1

The MGF of Y = aX + b satisfies:
φY(s) = e^{sb} φX(as)

The nth moment:
E[X^n] = d^n φX(s) / ds^n |_{s=0}

Theorem: For X1, …, Xn a sequence of independent RVs, the MGF of W = X1 + … + Xn is
φW(s) = φX1(s) φX2(s) ⋯ φXn(s)

Poisson Process of rate λ:
PN(T)(n) = (λT)^n e^{−λT} / n! for n = 0, 1, 2, …; 0 otherwise

Theorem:
For a Poisson process of rate λ, the inter-arrival times X1, X2, … are an iid random sequence with the exponential PDF
fX(x) = λ e^{−λx} for x ≥ 0; 0 otherwise

Autocovariance:
CX(t, τ) = Cov[X(t), X(t + τ)]

Autocorrelation and PSD of a WSS process X(t):
∫_{−∞}^{∞} SX(f) df = E[X²(t)] = RX(0)
RX(τ) = RX(−τ),  RX(τ) ≤ RX(0),  SX(−f) = SX(f)

Average Power of a WSS RP:
RX(0) = E[X²(t)]

Cross Correlation Functions (CCF): The cross correlation of random processes X(t) and Y(t) is
RXY(t, τ) = E[X(t) Y(t + τ)]

Jointly WSS Processes: The random processes X(t) and Y(t) are jointly wide sense stationary if X(t) and Y(t) are each wide sense stationary, and the cross correlation satisfies
RXY(t, τ) = RXY(τ)

Power Spectral Density of a Random Sequence: The PSD of a WSS random sequence Xn is
SX(φ) = lim_{L→∞} 1/(2L + 1) E[ |∑_{n=−L}^{L} Xn e^{−j2πφn}|² ]

Cross Spectral Density:
SXY(f) = ∫_{−∞}^{∞} RXY(τ) e^{−j2πfτ} dτ

For a WSS process X(t) input to an LTI filter with impulse response h(t) and frequency response H(f), the output Y(t) satisfies
RY(τ) = ∫_{−∞}^{∞} h(u) ∫_{−∞}^{∞} h(v) RX(τ + u − v) dv du
SXY(f) = H(f) SX(f)
SY(f) = H*(f) SXY(f)
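Worked example (illustrative numbers): for a Poisson process of rate λ = 2 arrivals per second observed over T = 3 seconds, P[N(3) = 5] = (6)^5 e^{−6} / 5! ≈ 0.161, and the mean inter-arrival time is E[X] = 1/λ = 0.5 seconds.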
Trigonometric Identities
Approximation