326 Formulas

This document contains formulas and concepts related to probability theory and random variables, including definitions of probability, random variables, expected value, variance, and various types of probability distributions. It also discusses functions of random variables, joint distributions, and important theorems such as the Central Limit Theorem. Additionally, it includes formulas for conditional probabilities, moment generating functions, and various inequalities relevant to probability and statistics.

Uploaded by squooshy2003

ELEC 326 Formulas

Probability Theory

• The probability of event A occurring in sample space Ω (with equally likely outcomes) is P[A] = |A|/|Ω|.
• B ⊂ A means B is a subset of A.
• The union A ∪ B is the set of elements in either A or B.
• The intersection A ∩ B is the set of elements in both A and B.
• The complement of A, written A^c, is the set of all elements not in A.
• A exclusive-or B, A ⊕ B, is the set of elements in A or B but not both.
• A and B are disjoint if A ∩ B = {} (the null set).
• A partition of Ω is a set of events whose union equals Ω and whose pairwise intersections equal {}.
• De Morgan's Law: (A ∪ B)^c = A^c ∩ B^c.
• Axiom 1: P[A] ≥ 0 (positive)
• Axiom 2: P[Ω] = 1 (unity)
• Axiom 3: P[A ∪ B] = P[A] + P[B] − P[A ∩ B] (additive; the last term vanishes when A and B are disjoint)
• The number of subsets of size k from a set of size n is (n choose k).
• Conditional probability: P[A|B] = P[A ∩ B]/P[B]
• Bayes' Theorem: P[A|B] = P[B|A]P[A]/P[B]
• Sampling without replacement (Hypergeometric): M out of N items have a desirable attribute, and we sample n of them. The probability that we get x with the attribute is P[x = x] = (M choose x)(N−M choose n−x)/(N choose n).
• Chi-Squared Test: If Oi is an observed set of data and Ei is a set we are trying to fit it to, we define χ² = Σi (Oi − Ei)²/Ei. If χ² is small, it is a good fit; if it is large, it is a bad fit.
• Cumulative Distribution Function: Fx(x) = ∫_−∞^x fx(α) dα
• Continuous RV: Fx(x) is a continuous function.
• Discrete RV: Fx(x) is a staircase function.
• Mixed RV: Fx(x) is a continuous function with jumps.

Expected Value and Variance

• Expected Value (Mean): E[x] = ∫_−∞^∞ x fx(x) dx = µx
• Expected value is linear: E[ax + b] = aE[x] + b
• nth moment of x about a: E[(x − a)^n]
• Variance: var(x) = σx² = E[(x − µx)²] = E[x²] − E[x]²
• var(ax + b) = |a|² var(x)
• Chebyshev Inequality: P[|x − µx| ≥ ε] ≤ σx²/ε²
• Markov Inequality: P[x ≥ a] ≤ E[x]/a
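As a quick numeric check of the counting formulas, the hypergeometric probability above can be evaluated directly with binomial coefficients; a minimal sketch in Python (the function name and the numbers N = 20, M = 5, n = 4 are illustrative, not from the slides):

```python
from math import comb

def hypergeom_pmf(x, N, M, n):
    # P[x of the n sampled items have the attribute], when M of the N items have it.
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

# Illustrative numbers: 5 of 20 items are desirable and we draw 4 without replacement.
N, M, n = 20, 5, 4
probs = [hypergeom_pmf(x, N, M, n) for x in range(n + 1)]
total = sum(probs)  # the pmf over x = 0..n should sum to 1
```

Summing the pmf over all feasible x recovers 1, which is a handy sanity check when translating the formula.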

Random Variables

• A Random Variable x maps an outcome of Ω to a real number: (ω ∈ Ω) ↦ (x ∈ ℝ)
• Probability Mass Function: px(xi) = P[x = xi]
• Probability Density Function: a function fx(x) such that P[x ∈ B] = ∫_B fx(x) dx

Types of Random Variables

• Degenerate: RV that takes on only one value.
• Binary/Bernoulli: RV that can only be 0 or 1.
• Indicator of A: RV that is 1 for elements of the event A, and 0 otherwise. (80)
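The expected-value and variance definitions, and the Chebyshev bound, can be verified exactly for a small discrete RV; a sketch in Python (the fair-die pmf is my own example, not from the slides):

```python
# A fair six-sided die as a simple discrete RV (illustrative example).
pmf = {v: 1 / 6 for v in range(1, 7)}

mean = sum(v * p for v, p in pmf.items())        # E[x] = sum of v * p(v)
second = sum(v * v * p for v, p in pmf.items())  # E[x^2]
var = second - mean ** 2                         # var(x) = E[x^2] - E[x]^2

# Chebyshev: P[|x - mu| >= eps] <= var / eps^2, computed exactly from the pmf.
eps = 2
tail = sum(p for v, p in pmf.items() if abs(v - mean) >= eps)
bound = var / eps ** 2
```

For the die, the exact tail probability (1/3) does sit below the Chebyshev bound (35/48), illustrating that the inequality is valid but loose.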

• Uniform: RV that is constant over a range of values. (105)
• Poisson: px(k) = e^(−λ) λ^k / k! (84)
• Geometric: px(k) = (1 − a)a^(k−1)
• Binomial: px(k) = (n choose k) p^k (1 − p)^(n−k)
• Exponential/Laplacian: fx(x) = (1/λ) e^(−x/λ) U(x) (107)
• Gamma: fx(x) = (1/(λΓ(α))) (x/λ)^(α−1) e^(−x/λ) U(x) (171)
• Normal/Gaussian: fx(x) = (1/√(2πσ²)) exp(−(x − µ)²/(2σ²)) (150)
• t-Distribution: (see slide 155)
• Cauchy: (see slide 157)
• Chi-Squared: (see slide 168)
• Bivariate: Uniform inside the unit circle. (300)
• Jointly Normal RV: (x, y) = N(µx, µy; σx², σy², ρ) (303)

Types of Functions

• Quantizer: y = Q(x) rounds the value of x to the nearest integer.
• Monotonic: A function that either never increases or never decreases.
• Differential Entropy of an RV: H(x) = E[−ln(fx(x))] = −∫_−∞^∞ ln(fx(x)) fx(x) dx nats (231)

Functions of Random Variables

• For y = g(x), if g(x) is invertible (monotone), then Fy(y) = Fx(g⁻¹(y)) and fy(y) = fx(g⁻¹(y)) / |d g(x)/dx| evaluated at x = g⁻¹(y)
• DAD Method: (see slide 203)
• Expected Value: For a function y = g(x), the expected value is E[y] = ∫_−∞^∞ g(x) fx(x) dx
• Characteristic Function: Φx(ω) = E[e^(jωx)] = ∫_−∞^∞ fx(x) e^(jωx) dx
• Moment Theorem: E[x^n] = (1/j^n) (d^n/dω^n) Φx(ω) evaluated at ω = 0
• Moment Generating Function: Mx(s) = E[e^(sx)] = ∫_−∞^∞ fx(x) e^(sx) dx
• If y = ax + b, then My(s) = e^(bs) Mx(as)

Multiple Random Variables

• An outcome ω is mapped to the two-dimensional plane by two random variables x(ω) and y(ω) to the point (x, y).
• Random Vector: n-dimensional vector of random variables, e.g. (x, y, z), mapping an outcome to n-dimensional space.
• Joint cdf: Fx,y(x, y) = P[x ≤ x, y ≤ y]
• Marginal cdf: Fx(x) = Fx,y(x, +∞) and Fy(y) = Fx,y(+∞, y).
• Joint pmf: px,y(xk, yl) = P[x = xk, y = yl] where xk and yl are possible values of the RVs.
• Marginal pmf: px(xj) = Σ_k px,y(xj, yk)
• Conditional pmf: the pmf of x given y = yk is px|y(x|yk) = px,y(x, yk)/py(yk)
• Joint pdf: the jpdf is a function fx,y(x, y) such that P[(x, y) ∈ A] = ∫∫_(α,β)∈A fx,y(α, β) dα dβ
• Conditional pdf: fx|y(x|y) = fx,y(x, y)/fy(y)
• Conditional Expectation of y given x: E[y|x = x] = ∫_−∞^∞ y fy|x(y|x) dy
• Covariance: cov(x, y) = E[xy] − E[x]E[y]
• Independence: x and y are independent if Fx,y(x, y) = Fx(x)Fy(y), or equivalently fx,y(x, y) = fx(x)fy(y)
• If x and y are independent, then cov(x, y) = 0 (the converse does not hold in general).

• The correlation of x and y is E[xy]. If it is zero, the RVs are uncorrelated or orthogonal.
• Correlation Coefficient: ρx,y = cov(x, y)/(σx σy)
• Central Limit Theorem: For n independent RVs (x1 ... xn) with defined means µi and variances σi², define the random variable z = (1/√n) Σ_(i=1)^n (xi − µi)/σi. The distribution of z approaches a Gaussian with zero mean and unit variance: lim_(n→∞) fz(z) = (1/√(2π)) e^(−z²/2)
• Random Processes: (see slide 388)

Functions of Two RV's

• z = g(x, y)
• Fz(z) = P[g(x, y) ≤ z]; differentiate to get fz(z)
• Expected Value: E[z] = ∫_−∞^∞ ∫_−∞^∞ g(x, y) fx,y(x, y) dx dy
• The mean of a sum of RVs equals the sum of the means; for uncorrelated RVs, the variance of the sum equals the sum of the variances.
• A function of a bivariate RV can be represented as z = g(r) where r = √(x² + y²)
• z = max(x, y): Fz(z) = Fx,y(z, z). If x and y are independent then Fz(z) = Fx(z)Fy(z), and if they are jointly continuous, then fz(z) = fx(z)Fy(z) + Fx(z)fy(z)
• w = min(x, y): Fw(w) = Fx(w) + Fy(w) − Fx,y(w, w). If x and y are independent, then Fw(w) = Fx(w) + Fy(w) − Fx(w)Fy(w), and if they are jointly continuous, then fw(w) = fx(w)(1 − Fy(w)) + (1 − Fx(w))fy(w)
• Useful summary on slide 358.

Other Useful Formulae/Identities

• (n choose k) = n!/(k!(n − k)!)
• Binomial Theorem: (a + b)^n = Σ_(k=0)^n (n choose k) a^k b^(n−k)
• Gamma Function: Γ(α) = ∫_0^∞ x^(α−1) e^(−x) dx
• If α is a positive integer, Γ(α) = (α − 1)!
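The gamma-function and binomial-theorem identities above can be checked directly with the standard library; a sketch in Python (the values a = 1.5, b = 2.0, n = 6 are illustrative):

```python
from math import comb, gamma, factorial, isclose

# Gamma(alpha) = (alpha - 1)! for positive integer alpha, checked for alpha = 1..7.
gamma_matches = all(isclose(gamma(a), factorial(a - 1)) for a in range(1, 8))

# Binomial theorem: (a + b)^n = sum over k of C(n, k) a^k b^(n-k).
a, b, n = 1.5, 2.0, 6  # illustrative values
lhs = (a + b) ** n
rhs = sum(comb(n, k) * a ** k * b ** (n - k) for k in range(n + 1))
```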

Two Functions of Two RV's

• w = g(x, y) and z = h(x, y)
• The jpdf is fw,z(w, z) = fx,y(x, y) · |∂(x, y)/∂(w, z)|, where |∂(x, y)/∂(w, z)| is the ratio of a small area around (x, y) to the corresponding small area around (w, z).
• |∂(x, y)/∂(w, z)| = 1/|J(x, y)|, where the Jacobian J(x, y) = ∂(g, h)/∂(x, y) is the determinant of the matrix of partials [∂g/∂x, ∂g/∂y; ∂h/∂x, ∂h/∂y] (see slide 375)
• For the transformation to polar coordinates r = √(x² + y²) and θ = arctan(y/x), the ratio is r.
• Gaussian Random Vector: (see slide 380)
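The claim that the polar-coordinate area ratio is r can be checked by approximating the Jacobian determinant with central finite differences; a sketch in Python (function and variable names are my own, and the test point is arbitrary):

```python
from math import atan2, hypot

def g(x, y):  # r component of the polar transformation
    return hypot(x, y)

def h(x, y):  # theta component of the polar transformation
    return atan2(y, x)

def jacobian_det(x, y, eps=1e-6):
    # Central differences approximate the 2x2 matrix of partials; return its determinant.
    dgx = (g(x + eps, y) - g(x - eps, y)) / (2 * eps)
    dgy = (g(x, y + eps) - g(x, y - eps)) / (2 * eps)
    dhx = (h(x + eps, y) - h(x - eps, y)) / (2 * eps)
    dhy = (h(x, y + eps) - h(x, y - eps)) / (2 * eps)
    return dgx * dhy - dgy * dhx

x, y = 0.6, 0.8        # a point with r = 1.0
r = hypot(x, y)
det = jacobian_det(x, y)  # analytically J = 1/r, so the area ratio 1/|J| is r
```

Here 1/|J| recovers the ratio r quoted above, matching the formula fw,z = fx,y / |J|.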
