Logistic Distribution
The logistic distribution is a continuous probability distribution that is often used in statistics and machine
learning. It is defined by two parameters: µ (the location parameter) and σ (the scale parameter).
The probability density function (PDF) of the logistic distribution is:
f(x;\mu,\sigma) = \frac{e^{-(x-\mu)/\sigma}}{\sigma\left(1 + e^{-(x-\mu)/\sigma}\right)^2}
where:
• µ is the location parameter, which determines the center of the distribution.
• σ is the scale parameter, which controls the spread of the distribution.
The cumulative distribution function (CDF) of the logistic distribution is:
F(x;\mu,\sigma) = \frac{1}{1 + e^{-(x-\mu)/\sigma}}
This distribution is similar to the normal distribution but has heavier tails, which can make it useful for
modeling data with outliers or for binary classification problems in machine learning.
The logistic distribution is supported on all of (−∞, ∞): x can take any real value.
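As a quick illustration, the following Python sketch (illustrative only; the parameter values and the use of scipy are assumptions, not part of this text) evaluates the PDF and CDF above and checks them against scipy.stats.logistic, whose loc and scale parameters play the roles of µ and σ.

import numpy as np
from scipy.stats import logistic

def logistic_pdf(x, mu, sigma):
    # PDF from the formula above: e^{-(x-mu)/sigma} / (sigma * (1 + e^{-(x-mu)/sigma})^2)
    z = np.exp(-(x - mu) / sigma)
    return z / (sigma * (1.0 + z) ** 2)

def logistic_cdf(x, mu, sigma):
    # CDF from the formula above: 1 / (1 + e^{-(x-mu)/sigma})
    return 1.0 / (1.0 + np.exp(-(x - mu) / sigma))

mu, sigma = 1.5, 2.0              # arbitrary example parameters
x = np.linspace(-5.0, 5.0, 11)
assert np.allclose(logistic_pdf(x, mu, sigma), logistic.pdf(x, loc=mu, scale=sigma))
assert np.allclose(logistic_cdf(x, mu, sigma), logistic.cdf(x, loc=mu, scale=sigma))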
To derive the expectation (mean) and variance of the logistic distribution, we start with its probability density
function (PDF) and cumulative distribution function (CDF). For the logistic distribution with parameters µ
(location) and σ (scale), the PDF is:
f(x;\mu,\sigma) = \frac{e^{-(x-\mu)/\sigma}}{\sigma\left(1 + e^{-(x-\mu)/\sigma}\right)^2}

and the CDF is

F(x;\mu,\sigma) = \frac{1}{1 + e^{-(x-\mu)/\sigma}}
1. Expectation (Mean)
The expectation (mean) of the logistic distribution can be found directly from the PDF. However, to derive it
formally, let’s compute it using integration:
E[X] = \int_{-\infty}^{\infty} x\, f(x;\mu,\sigma)\, dx

E[X] = \int_{-\infty}^{\infty} x\, \frac{e^{-(x-\mu)/\sigma}}{\sigma\left(1 + e^{-(x-\mu)/\sigma}\right)^2}\, dx

Substituting z = (x − µ)/σ, so that x = µ + σz and dx = σ dz, gives

E[X] = \int_{-\infty}^{\infty} (\mu + \sigma z)\, \frac{e^{-z}}{\sigma\left(1 + e^{-z}\right)^2}\, \sigma\, dz

E[X] = \int_{-\infty}^{\infty} (\mu + \sigma z)\, \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz

E[X] = \mu \int_{-\infty}^{\infty} \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz + \sigma \int_{-\infty}^{\infty} \frac{z\, e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz

The first integral is the total mass of the standard logistic density, so

\int_{-\infty}^{\infty} \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz = 1

The second integrand is an odd function (the standard logistic density is symmetric about zero), so

\int_{-\infty}^{\infty} \frac{z\, e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz = 0

Therefore:

E[X] = \mu \cdot 1 + \sigma \cdot 0 = \mu
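As a sanity check, E[X] = µ can be verified numerically. The sketch below (example values of µ and σ assumed, using scipy's implementation of the same density) integrates x f(x; µ, σ) over the real line.

import numpy as np
from scipy.integrate import quad
from scipy.stats import logistic   # same density f(x; mu, sigma) as above

mu, sigma = 0.7, 1.3               # arbitrary example parameters
mean, _ = quad(lambda x: x * logistic.pdf(x, loc=mu, scale=sigma), -np.inf, np.inf)
print(mean)                        # approximately 0.7, i.e. equal to mu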
2. Variance
To find the variance, we first compute E[X²]:

E[X^2] = \int_{-\infty}^{\infty} x^2\, f(x;\mu,\sigma)\, dx

E[X^2] = \int_{-\infty}^{\infty} x^2\, \frac{e^{-(x-\mu)/\sigma}}{\sigma\left(1 + e^{-(x-\mu)/\sigma}\right)^2}\, dx

With the same substitution z = (x − µ)/σ as before:

E[X^2] = \int_{-\infty}^{\infty} (\mu + \sigma z)^2\, \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz

Expanding (µ + σz)²:

(\mu + \sigma z)^2 = \mu^2 + 2\mu\sigma z + \sigma^2 z^2

E[X^2] = \int_{-\infty}^{\infty} \left(\mu^2 + 2\mu\sigma z + \sigma^2 z^2\right) \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz

E[X^2] = \mu^2 \int_{-\infty}^{\infty} \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz + 2\mu\sigma \int_{-\infty}^{\infty} \frac{z\, e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz + \sigma^2 \int_{-\infty}^{\infty} \frac{z^2\, e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz

As before,

\int_{-\infty}^{\infty} \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz = 1, \qquad \int_{-\infty}^{\infty} \frac{z\, e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz = 0

and the remaining integral is the second moment of the standard logistic distribution:

\int_{-\infty}^{\infty} \frac{z^2\, e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz = \frac{\pi^2}{3}

Thus:

E[X^2] = \mu^2 + \sigma^2 \cdot \frac{\pi^2}{3}

Var(X) = E[X^2] - \left(E[X]\right)^2 = \mu^2 + \sigma^2 \cdot \frac{\pi^2}{3} - \mu^2

Var(X) = \sigma^2 \cdot \frac{\pi^2}{3}
So, the expectation and variance of the logistic distribution are:
• Expectation (Mean): µ
• Variance: σ² · π²/3
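A short numerical cross-check of both results (illustrative parameter values assumed, again using scipy's logistic density):

import numpy as np
from scipy.integrate import quad
from scipy.stats import logistic

mu, sigma = 0.7, 1.3
ex2, _ = quad(lambda x: x**2 * logistic.pdf(x, loc=mu, scale=sigma), -np.inf, np.inf)
print(ex2 - mu**2)                 # numerical Var(X) = E[X^2] - mu^2
print(sigma**2 * np.pi**2 / 3)     # closed form; the two values agree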
To derive the cumulative distribution function (CDF) of the logistic distribution, we start with its probability
density function (PDF). The PDF of the logistic distribution with parameters µ (location) and σ (scale) is:
f(x;\mu,\sigma) = \frac{e^{-(x-\mu)/\sigma}}{\sigma\left(1 + e^{-(x-\mu)/\sigma}\right)^2}
The CDF is obtained by integrating the PDF up to x:

F(x;\mu,\sigma) = \int_{-\infty}^{x} \frac{e^{-(t-\mu)/\sigma}}{\sigma\left(1 + e^{-(t-\mu)/\sigma}\right)^2}\, dt

Substituting z = (t − µ)/σ, so that dt = σ dz, reduces this to an integral of the standard logistic density:

F(x;\mu,\sigma) = \int_{-\infty}^{(x-\mu)/\sigma} \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz

Now substitute u = 1 + e^{-z}, so that du = -e^{-z}\, dz:

\int \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz = \int \left(-\frac{1}{u^2}\right) du = \frac{1}{u} + C = \frac{1}{1 + e^{-z}} + C

The antiderivative \frac{1}{1 + e^{-z}} tends to 0 as z → −∞, so evaluating it between the limits leaves only its value at the upper limit z = (x − µ)/σ.
Therefore, the cumulative distribution function (CDF) of the logistic distribution is:
F(x;\mu,\sigma) = \frac{1}{1 + e^{-(x-\mu)/\sigma}}
This result shows that the CDF of the logistic distribution is a shifted and scaled sigmoid function; for µ = 0 and σ = 1 it is exactly the sigmoid used in logistic regression and neural networks.
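The derivation can also be confirmed numerically; the sketch below (example values assumed) integrates the PDF up to a point x and compares the result with the closed-form CDF.

import numpy as np
from scipy.integrate import quad
from scipy.stats import logistic

mu, sigma, x = -0.5, 0.8, 1.2
numeric, _ = quad(lambda t: logistic.pdf(t, loc=mu, scale=sigma), -np.inf, x)
closed_form = 1.0 / (1.0 + np.exp(-(x - mu) / sigma))
print(numeric, closed_form)        # the two values agree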
To derive the moment-generating function (MGF) of the logistic distribution, we use its probability density function (PDF). The MGF, M_X(t), is defined as:

M_X(t) = E\left[e^{tX}\right]

where E denotes the expectation. For a random variable X with the logistic distribution, this is given by:

M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x;\mu,\sigma)\, dx

with

f(x;\mu,\sigma) = \frac{e^{-(x-\mu)/\sigma}}{\sigma\left(1 + e^{-(x-\mu)/\sigma}\right)^2}

so that

M_X(t) = \int_{-\infty}^{\infty} e^{tx}\, \frac{e^{-(x-\mu)/\sigma}}{\sigma\left(1 + e^{-(x-\mu)/\sigma}\right)^2}\, dx

Substitute z = (x − µ)/σ, so that x = µ + σz and dx = σ dz:

M_X(t) = \int_{-\infty}^{\infty} e^{t(\mu + \sigma z)}\, \frac{e^{-z}}{\sigma\left(1 + e^{-z}\right)^2}\, \sigma\, dz

Combine the exponents and factor out the term that does not depend on z:

M_X(t) = e^{\mu t} \int_{-\infty}^{\infty} e^{\sigma t z}\, \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz

The remaining integral is a known Beta-function integral: substituting u = 1/(1 + e^{-z}) turns it into \int_0^1 u^{\sigma t}(1-u)^{-\sigma t}\, du, which converges for |σt| < 1 and evaluates to

\int_{-\infty}^{\infty} e^{\sigma t z}\, \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz = B(1 + \sigma t,\, 1 - \sigma t) = \Gamma(1 + \sigma t)\, \Gamma(1 - \sigma t) = \frac{\pi \sigma t}{\sin(\pi \sigma t)}

Thus:

M_X(t) = e^{\mu t}\, \frac{\pi \sigma t}{\sin(\pi \sigma t)}, \qquad |\sigma t| < 1
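As a check on the MGF formula, the sketch below (example values assumed, chosen so that |σt| < 1 and the integral converges) compares a direct numerical evaluation of E[e^{tX}] with the closed form.

import numpy as np
from scipy.integrate import quad
from scipy.stats import logistic

mu, sigma, t = 0.4, 1.1, 0.5       # |sigma * t| = 0.55 < 1
numeric, _ = quad(lambda x: np.exp(t * x) * logistic.pdf(x, loc=mu, scale=sigma),
                  -np.inf, np.inf)
closed_form = np.exp(mu * t) * np.pi * sigma * t / np.sin(np.pi * sigma * t)
print(numeric, closed_form)        # the two values agree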
To derive the n-th raw moment of the logistic distribution, we compute

E[X^n] = \int_{-\infty}^{\infty} x^n f(x;\mu,\sigma)\, dx

where f(x; µ, σ) is the probability density function (PDF) of the logistic distribution:

f(x;\mu,\sigma) = \frac{e^{-(x-\mu)/\sigma}}{\sigma\left(1 + e^{-(x-\mu)/\sigma}\right)^2}

1. Substitute z = (x − µ)/σ:

E[X^n] = \int_{-\infty}^{\infty} (\mu + \sigma z)^n\, \frac{e^{-z}}{\sigma\left(1 + e^{-z}\right)^2}\, \sigma\, dz

E[X^n] = \int_{-\infty}^{\infty} (\mu + \sigma z)^n\, \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz

2. Expand (µ + σz)^n:

Using the binomial expansion,

(\mu + \sigma z)^n = \sum_{k=0}^{n} \binom{n}{k} \mu^{n-k} (\sigma z)^k

so that

E[X^n] = \sum_{k=0}^{n} \binom{n}{k} \mu^{n-k} \sigma^k \int_{-\infty}^{\infty} z^k\, \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz

3. Evaluate the remaining integrals:

The integrals

m_k = \int_{-\infty}^{\infty} z^k\, \frac{e^{-z}}{\left(1 + e^{-z}\right)^2}\, dz

are the raw moments of the standard logistic distribution. By the symmetry of the standard logistic density, m_k = 0 for odd k, while for even k

m_k = 2\, k!\, \left(1 - 2^{1-k}\right) \zeta(k), \qquad k = 2, 4, \ldots

with m_0 = 1, where ζ denotes the Riemann zeta function; in particular m_2 = π²/3, consistent with the variance found above. These results can be derived from properties of the Gamma function and specific integrals involving the Beta function, or by repeatedly differentiating the MGF derived above at t = 0.

4. Combine Results:

Combining the results, the n-th raw moment of the logistic distribution is:

E[X^n] = \sum_{\substack{0 \le k \le n \\ k\ \mathrm{even}}} \binom{n}{k} \mu^{n-k} \sigma^k\, m_k = \mu^n + \sum_{\substack{2 \le k \le n \\ k\ \mathrm{even}}} \binom{n}{k} \mu^{n-k} \sigma^k\, 2\, k!\, \left(1 - 2^{1-k}\right) \zeta(k)
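As a concrete check of this formula, the sketch below (illustrative values assumed) computes E[X⁴] both by numerical integration and from the sum, using the even standard-logistic moments m₀ = 1, m₂ = π²/3 and m₄ = 7π⁴/15.

import numpy as np
from math import comb
from scipy.integrate import quad
from scipy.stats import logistic

mu, sigma, n = 0.6, 0.9, 4
numeric, _ = quad(lambda x: x**n * logistic.pdf(x, loc=mu, scale=sigma), -np.inf, np.inf)
m = {0: 1.0, 2: np.pi**2 / 3, 4: 7 * np.pi**4 / 15}   # even standard-logistic moments m_k
closed_form = sum(comb(n, k) * mu**(n - k) * sigma**k * m[k] for k in (0, 2, 4))
print(numeric, closed_form)        # the two values agree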