Information & Communication
Conditional Probability →
The conditional probability of an event A, given an event B with P(B) > 0, is denoted by P(A∣B) and defined as:
P(A∣B) = P(A∩B) / P(B)
For a fixed event B, it can be verified that the conditional probabilities P(A∣B) form a legitimate probability law that satisfies the three axioms.
Bayes' Rule → Let A1, A2, …, An be disjoint events that form a partition of the sample space, with P(Ai) > 0 for every i. Then, for any event B with P(B) > 0, we have:
P(Ai∣B) = P(Ai)·P(B∣Ai) / P(B)
Bayes’ rule is used for inference. There are a number of “causes” that
may result in a certain “effect.” We observe the effect, and we wish to
infer the cause.
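As a concrete sketch of this kind of inference, here is a minimal example, assuming a hypothetical two-cause model with made-up prior and likelihood values:

```python
# Inference with Bayes' rule (hypothetical numbers): two possible
# "causes" A1, A2 partition the sample space; we observe "effect" B.

priors = {"A1": 0.3, "A2": 0.7}       # P(Ai)
likelihoods = {"A1": 0.9, "A2": 0.2}  # P(B | Ai)

# Total probability: P(B) = sum over i of P(Ai) * P(B | Ai)
p_b = sum(priors[a] * likelihoods[a] for a in priors)

# Bayes' rule: P(Ai | B) = P(Ai) * P(B | Ai) / P(B)
posteriors = {a: priors[a] * likelihoods[a] / p_b for a in priors}
print(posteriors)  # {'A1': 0.658..., 'A2': 0.341...}
```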
Independent Events:
When the occurrence of B provides no information and does not alter the probability that A has occurred, i.e., P(A∣B) = P(A), we say that A is independent of B. Equivalently, P(A∩B) = P(A)·P(B).
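A quick numerical check of this definition, using two rolls of a fair die as an assumed sample space:

```python
from itertools import product

# Sample space: two rolls of a fair die, all 36 outcomes equally likely.
omega = list(product(range(1, 7), repeat=2))

A = {w for w in omega if w[0] == 6}      # first roll is a 6
B = {w for w in omega if w[1] % 2 == 0}  # second roll is even

def prob(event):
    return len(event) / len(omega)

# P(A | B) = P(A ∩ B) / P(B) equals P(A) when A is independent of B.
print(prob(A & B) / prob(B), prob(A))  # both 0.1666...
```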
A random variable is called discrete if its range (the set of values that it can take) is finite or at most countably infinite.
The probability mass of x, denoted pX(x), is the probability of the event {X = x}: pX(x) = P({X = x}).
Note: ∑x pX (x) = 1
A Bernoulli PMF:
pX(x) = { p, if x = 1; 1 − p, if x = 0 }
A Binomial PMF:
pX(k) = C(n, k)·p^k·(1 − p)^(n−k), k = 0, 1, …, n, where C(n, k) = n!/(k!(n − k)!)
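A small sketch evaluating both PMFs and checking the normalization ∑x pX(x) = 1 (the parameter values n = 10, p = 0.3 are arbitrary):

```python
from math import comb

def bernoulli_pmf(x, p):
    # pX(x) = p if x = 1, 1 - p if x = 0
    return p if x == 1 else 1 - p

def binomial_pmf(k, n, p):
    # pX(k) = C(n, k) * p^k * (1 - p)^(n - k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
assert bernoulli_pmf(1, p) + bernoulli_pmf(0, p) == 1
assert abs(sum(binomial_pmf(k, n, p) for k in range(n + 1)) - 1) < 1e-12
```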
We define the expected value (also called the expectation or the mean) of a random variable X, with PMF pX(x), by:
E[X] = ∑x x·pX(x)
Moment: We define the nth moment as E[X^n], the expected value of the random variable X^n. With this terminology, the 1st moment of X is just the mean.
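For example, a sketch computing the first two moments of a made-up PMF:

```python
# Expected value and nth moment of a discrete random variable,
# here with an arbitrary PMF over {0, 1, 2} for illustration.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

def moment(pmf, n):
    # E[X^n] = sum over x of x^n * pX(x)
    return sum(x**n * px for x, px in pmf.items())

print(moment(pmf, 1))  # 1st moment (the mean): 1.1
print(moment(pmf, 2))  # 2nd moment E[X^2]: 1.7
```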
More generally, P(a ≤ X ≤ b) = ∫_a^b fX(x) dx.
For any single value a, we have P(X = a) = ∫_a^a fX(x) dx = 0.
To qualify as a PDF, a function fX must satisfy fX(x) ≥ 0 for every x, and must also satisfy the normalization equation: ∫_{−∞}^{∞} fX(x) dx = P(−∞ < X < ∞) = 1.
Graphically, this means that the entire area under the graph of the PDF
must be equal to 1.
A random variable X that is uniformly distributed over an interval [a, b] has a PDF of the form:
fX(x) = { c, if a ≤ x ≤ b; 0, otherwise }
where c is a constant. By the normalization equation, c = 1/(b − a).
The expected value of a continuous random variable X is defined as E[X] = ∫_{−∞}^{∞} x·fX(x) dx. This is similar to the discrete case except that the PMF is replaced by the PDF, and summation is replaced by integration.
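A numerical sketch of these integrals for the uniform PDF above, with the arbitrary choice [a, b] = [2, 5] and a simple midpoint-rule integrator:

```python
a, b = 2.0, 5.0
c = 1 / (b - a)  # normalization constant of the uniform PDF

def f(x):
    # uniform PDF: c on [a, b], 0 elsewhere
    return c if a <= x <= b else 0.0

def integrate(g, lo, hi, n=100_000):
    # midpoint-rule approximation of the integral of g over [lo, hi]
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

print(integrate(f, a, b))                   # ~1.0: normalization
print(integrate(lambda x: x * f(x), a, b))  # ~3.5: E[X] = (a + b)/2
```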
FX(x) = P(X ≤ x) = ∑_{k ≤ x} pX(k), if X is discrete
FX(x) = P(X ≤ x) = ∫_{−∞}^{x} fX(t) dt, if X is continuous
Loosely speaking, the CDF “accumulates” probability “up to” the value
x.
Any random variable associated with a given probability model has a CDF, regardless of whether it is discrete or continuous.
Properties of CDF →
1. FX is monotonically nondecreasing: if x ≤ y, then FX(x) ≤ FX(y).
2. FX(x) tends to 0 as x → −∞, and to 1 as x → ∞.
3. If X is discrete, then FX has a piecewise constant and staircase-like form.
4. If X is continuous, then FX has a continuously varying form.
If X is continuous, the PDF can be obtained from the CDF by differentiation:
fX(x) = dFX(x)/dx
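A brief sketch of both forms: a staircase CDF built from a made-up PMF, and a finite-difference estimate recovering the PDF from the CDF of a uniform random variable on [0, 1]:

```python
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

def cdf_discrete(x):
    # FX(x) = sum of pX(k) over all k <= x (piecewise constant, staircase)
    return sum(p for k, p in pmf.items() if k <= x)

def cdf_uniform(x):
    # CDF of a uniform random variable on [0, 1]: FX(x) = x there
    return min(max(x, 0.0), 1.0)

# Recover the PDF from the CDF by numerical differentiation: fX = dFX/dx
eps = 1e-6
x0 = 0.4
pdf_estimate = (cdf_uniform(x0 + eps) - cdf_uniform(x0 - eps)) / (2 * eps)

print(cdf_discrete(1))  # 0.75
print(pdf_estimate)     # ~1.0, matching fX(x) = 1 on [0, 1]
```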
Entropy
The entropy H(X) of a discrete random variable X with PMF p(x) is defined as: H(X) = −∑x p(x)·log p(x), where the log is usually taken to base 2, so entropy is measured in bits.
Lemma: H(X) ≥ 0
The joint entropy: The joint entropy H(X, Y) of a pair of discrete random variables (X, Y) with a joint distribution p(x, y) is defined as:
H(X, Y) = −∑x ∑y p(x, y)·log p(x, y)
The conditional entropy H(Y∣X) = −∑x ∑y p(x, y)·log p(y∣x) plays the same role in the chain rule below.
Chain rule: H(X1, X2, …, Xn) = ∑_{i=1}^{n} H(Xi ∣ Xi−1, …, X1)
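A sketch checking the two-variable case of the chain rule, H(X, Y) = H(X) + H(Y∣X), on a small made-up joint distribution:

```python
from math import log2

# Made-up joint PMF p(x, y) over X in {0, 1}, Y in {0, 1}.
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

def entropy(dist):
    # H = -sum p * log2(p), in bits; 0 * log 0 is taken as 0.
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal of X: p(x) = sum over y of p(x, y)
px = {}
for (x, _), p in joint.items():
    px[x] = px.get(x, 0) + p

# Conditional entropy H(Y | X) = -sum p(x, y) * log2(p(x, y) / p(x))
h_y_given_x = -sum(p * log2(p / px[x]) for (x, _), p in joint.items() if p > 0)

# Chain rule for two variables: H(X, Y) = H(X) + H(Y | X)
print(entropy(joint), entropy(px) + h_y_given_x)  # both 1.75
```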