6. Approximation and fitting
Regularized approximation
Robust approximation
▶ Euclidean approximation (∥ · ∥₂)
– solution x★ = A†b
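As a quick numerical check (a NumPy sketch; the matrix and vector are arbitrary illustrative data), the Euclidean approximation x★ = A†b can be computed with the pseudoinverse and verified against the normal equations:

```python
import numpy as np

# overdetermined system: A is 5x3 (full column rank), so Ax = b has no exact solution
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

# Euclidean (least-squares) approximation: x* = A^+ b
x_star = np.linalg.pinv(A) @ b

# same solution via the normal equations (A^T A) x = A^T b
x_ne = np.linalg.solve(A.T @ A, A.T @ b)
assert np.allclose(x_star, x_ne)

# optimality: the residual A x* - b is orthogonal to the range of A
r = A @ x_star - b
assert np.allclose(A.T @ r, np.zeros(3), atol=1e-10)
```

The orthogonality check is the geometric content of the normal equations: at the optimum, no direction in the range of A can further reduce the residual.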
▶ penalty functions 𝜙(u):
– absolute value (p = 1): 𝜙(u) = |u|
– square (p = 2): 𝜙(u) = u²
– deadzone-linear: 𝜙(u) = max{0, |u| − a}
– Huber (with parameter M): 𝜙hub(u) = u² for |u| ≤ M, and 𝜙hub(u) = M(2|u| − M) for |u| > M
[figure: plots of 𝜙(u) versus u for each penalty]
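A minimal NumPy sketch of the Huber penalty above (the function name and test values are mine, not from the slides); note it is quadratic near zero but grows only linearly for |u| > M, which is what makes it robust to large residuals:

```python
import numpy as np

def phi_hub(u, M=1.0):
    """Huber penalty: quadratic for |u| <= M, linear growth beyond M."""
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= M, u**2, M * (2.0 * np.abs(u) - M))

u = np.array([-1.5, -0.5, 0.0, 0.5, 1.5])
print(phi_hub(u, M=1.0))  # quadratic inside [-1, 1], linear outside
```

At |u| = M both branches give M², so the penalty (and its slope) is continuous across the switch.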
[figure: f(t) versus t]
▶ least-norm problem:
    minimize ∥x∥
    subject to Ax = b
with A ∈ Rm×n, m ≤ n, and ∥ · ∥ any norm
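For the Euclidean norm the least-norm solution is again x = A†b; a NumPy sketch with illustrative data, checking feasibility and that any other feasible point (obtained by adding a nullspace direction) has larger norm:

```python
import numpy as np

# underdetermined system: A is 3x5 (m <= n), so Ax = b has many solutions
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))
b = rng.standard_normal(3)

# for the Euclidean norm, the least-norm solution is x = A^+ b
x_ln = np.linalg.pinv(A) @ b
assert np.allclose(A @ x_ln, b)

# any other feasible point x_ln + z, with z in the nullspace of A, is also
# feasible but has larger norm (x_ln lies in the row space, orthogonal to z)
z = rng.standard_normal(5)
z -= np.linalg.pinv(A) @ (A @ z)      # project z onto null(A)
x_other = x_ln + z
assert np.allclose(A @ x_other, b)
assert np.linalg.norm(x_other) >= np.linalg.norm(x_ln)
```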
▶ a bi-objective problem: minimize (w.r.t. R²₊) the pair (∥Ax − b∥, ∥x∥)
▶ estimation: linear measurement model y = Ax + v, with prior knowledge that ∥x∥ is small
▶ optimal design: small x is cheaper or more efficient, or the linear model y = Ax is only valid for small x
▶ robust approximation: a good approximation Ax ≈ b with small x is less sensitive to errors in A than a good approximation with large x
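One standard scalarization of this bi-objective problem is Tikhonov regularization, minimize ∥Ax − b∥₂² + δ∥x∥₂², with closed-form solution x = (AᵀA + δI)⁻¹Aᵀb. A NumPy sketch (the data and the δ values are illustrative) that sweeps δ to trace the trade-off curve:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def tikhonov(delta):
    # minimize ||Ax - b||_2^2 + delta * ||x||_2^2
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + delta * np.eye(n), A.T @ b)

# larger delta gives smaller ||x|| at the cost of a larger residual ||Ax - b||
for delta in [1e-3, 1e-1, 10.0]:
    x = tikhonov(delta)
    print(delta, np.linalg.norm(A @ x - b), np.linalg.norm(x))
```

Each δ > 0 yields one Pareto-optimal point; sweeping δ from 0 to ∞ traces the optimal trade-off curve between the two objectives.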
[figures: y(t) and u(t) versus t for three solutions]
Convex Optimization Boyd and Vandenberghe 6.14
Signal reconstruction
▶ bi-objective problem: minimize (w.r.t. R²₊) the pair (∥x̂ − xcor∥₂, 𝜙(x̂))
– x ∈ Rn is unknown signal
– xcor = x + v is (known) corrupted version of x, with additive noise v
– variable x̂ (reconstructed signal) is estimate of x
– 𝜙 : Rn → R is regularization function or smoothing objective
▶ examples:
– quadratic smoothing, 𝜙quad(x̂) = Σᵢ₌₁ⁿ⁻¹ (x̂ᵢ₊₁ − x̂ᵢ)²
– total variation smoothing, 𝜙tv(x̂) = Σᵢ₌₁ⁿ⁻¹ |x̂ᵢ₊₁ − x̂ᵢ|
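For quadratic smoothing, the scalarized problem minimize ∥x̂ − xcor∥₂² + δ 𝜙quad(x̂) has the closed-form solution (I + δDᵀD)x̂ = xcor, where D is the first-difference matrix. A NumPy sketch with a synthetic signal (the signal and δ are illustrative, not the slides' data):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
t = np.arange(n)
x = np.sin(2 * np.pi * t / n)            # smooth "true" signal
xcor = x + 0.3 * rng.standard_normal(n)  # corrupted version

# first-difference matrix D: (D xhat)_i = xhat_{i+1} - xhat_i
D = np.diff(np.eye(n), axis=0)

# minimize ||xhat - xcor||^2 + delta * ||D xhat||^2;
# setting the gradient to zero gives (I + delta * D^T D) xhat = xcor
delta = 20.0
xhat = np.linalg.solve(np.eye(n) + delta * D.T @ D, xcor)

# the smoothed estimate is closer to the true signal than the noisy one
print(np.linalg.norm(xcor - x), np.linalg.norm(xhat - x))
```

Since the true signal here is slowly varying, the quadratic penalty removes most of the noise while introducing little bias.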
[figures: original signal x and noisy signal xcor; three solutions x̂ on the trade-off curve of ∥x̂ − xcor∥₂ versus 𝜙quad(x̂)]
▶ quadratic smoothing smooths out noise and sharp transitions in signal
Total variation reconstruction
[figures: original signal x and noisy signal xcor; three solutions x̂ on the trade-off curve of ∥x̂ − xcor∥₂ versus 𝜙tv(x̂)]
▶ total variation smoothing preserves sharp transitions in the signal
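Why total variation preserves sharp edges while quadratic smoothing blurs them can be seen by comparing the two penalties on a jump versus a gradual ramp with the same total rise (a small NumPy sketch; the signals are illustrative):

```python
import numpy as np

def phi_quad(xhat):
    """Quadratic smoothing penalty: sum of squared first differences."""
    return np.sum(np.diff(xhat) ** 2)

def phi_tv(xhat):
    """Total variation penalty: sum of absolute first differences."""
    return np.sum(np.abs(np.diff(xhat)))

step = np.concatenate([np.zeros(50), np.ones(50)])   # one sharp jump of height 1
ramp = np.linspace(0.0, 1.0, 100)                    # same rise spread over 99 steps

# TV charges both signals the same amount (the total rise), so it does
# not discriminate against the sharp transition ...
print(phi_tv(step), phi_tv(ramp))

# ... while the quadratic penalty charges the jump far more,
# which is why quadratic smoothing smears edges out
print(phi_quad(step), phi_quad(ramp))
```

Here 𝜙tv is 1 for both signals, while 𝜙quad is 1 for the step but only 1/99 for the ramp, so a quadratic objective strongly prefers to replace jumps with ramps.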
▶ two approaches:
– stochastic: assume A is random, minimize E ∥Ax − b∥
– worst-case: A lies in a set 𝒜 of possible values, minimize sup A∈𝒜 ∥Ax − b∥
▶ example: A(u) = A0 + uA1, u ∈ [−1, 1]
– xnom minimizes ∥A0x − b∥₂²
– xstoch minimizes E ∥A(u)x − b∥₂² with u uniform on [−1, 1]
– xwc minimizes sup −1≤u≤1 ∥A(u)x − b∥₂²
[figure: r(u) = ∥A(u)x − b∥₂ versus u for xnom, xstoch, and xwc]
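For this example the stochastic objective has a closed form: since E u = 0 and E u² = 1/3 for u uniform on [−1, 1], E ∥(A0 + uA1)x − b∥₂² = ∥A0x − b∥₂² + (1/3)∥A1x∥₂², so xstoch solves a regularized least-squares problem. A NumPy sketch with illustrative data (A0, A1, b are mine, not the slides' example):

```python
import numpy as np

rng = np.random.default_rng(4)
A0 = rng.standard_normal((20, 5))
A1 = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

# nominal solution: ignore the uncertainty, minimize ||A0 x - b||^2
x_nom = np.linalg.solve(A0.T @ A0, A0.T @ b)

# stochastic solution: minimize ||A0 x - b||^2 + (1/3) ||A1 x||^2
x_stoch = np.linalg.solve(A0.T @ A0 + A1.T @ A1 / 3.0, A0.T @ b)

# Monte Carlo check: x_stoch has smaller *average* squared residual over u
us = rng.uniform(-1.0, 1.0, size=2000)
def avg_sq_res(x):
    return np.mean([np.sum(((A0 + u * A1) @ x - b) ** 2) for u in us])

print(avg_sq_res(x_nom), avg_sq_res(x_stoch))
```

The nominal solution is typically better at u = 0 but pays for it on average; xstoch trades a slightly larger nominal residual for lower expected residual.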
▶ A = Ā + U, U random, E U = 0, E UᵀU = P
▶ stochastic least-squares problem: minimize E ∥(Ā + U)x − b∥₂²
▶ explicit expression for objective:
    E ∥(Ā + U)x − b∥₂² = ∥Āx − b∥₂² + xᵀPx
  (the cross term vanishes since E U = 0, and E ∥Ux∥₂² = xᵀ(E UᵀU)x = xᵀPx)
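The identity E ∥(Ā + U)x − b∥₂² = ∥Āx − b∥₂² + xᵀPx turns the stochastic problem into a deterministic regularized least-squares problem with solution x = (ĀᵀĀ + P)⁻¹Āᵀb. A NumPy sketch (Ā, b, and P here are arbitrary illustrative data):

```python
import numpy as np

rng = np.random.default_rng(5)
Abar = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

# P = E U^T U must be positive semidefinite; build one as G^T G
G = rng.standard_normal((5, 5))
P = G.T @ G

# robust (stochastic) least-squares objective and its minimizer
def J(x):
    return np.sum((Abar @ x - b) ** 2) + x @ P @ x

x_rls = np.linalg.solve(Abar.T @ Abar + P, Abar.T @ b)

# plain least-squares solution, for comparison
x_ls = np.linalg.solve(Abar.T @ Abar, Abar.T @ b)

# x_rls minimizes J: its gradient vanishes, and it beats x_ls on J
grad = (Abar.T @ Abar + P) @ x_rls - Abar.T @ b
assert np.allclose(grad, 0, atol=1e-8)
assert J(x_rls) <= J(x_ls)
```

With P = δI this reduces to Tikhonov regularization, so the stochastic model gives one interpretation of where the regularization weight comes from.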
[figure: histogram of the residual r(u) = ∥(Ā + U)x − b∥₂ for xls, xtik, and xrls]