Lecture 5: Subspace Identification
The Deterministic Case
$$\min_{x}\ \|y - Fx\|_{W}^{2}$$
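As a minimal numerical illustration (not from the slides), the weighted least-squares problem above can be solved via its normal equations; the matrices F, W and the dimensions below are made-up example values.

```python
# Sketch: solving min_x ||y - F x||_W^2 via the weighted normal equations.
import numpy as np

rng = np.random.default_rng(0)
F = rng.standard_normal((20, 3))          # example regressor matrix (assumed sizes)
x_true = np.array([1.0, -2.0, 0.5])
y = F @ x_true + 0.1 * rng.standard_normal(20)
W = np.diag(rng.uniform(0.5, 2.0, 20))    # example positive-definite weighting

# (F^T W F) x_hat = F^T W y
x_hat = np.linalg.solve(F.T @ W @ F, F.T @ W @ y)
print(x_hat)                              # close to x_true
```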
[Block diagram: the parametrized predictor with matrices A(θ), B(θ), K(θ), C(θ), D(θ) produces ŷ(k, θ); the prediction error ǫ(k, θ) is the difference between the measured output and ŷ(k, θ).]
1. Assume a signal generating model SGM(θ).
2. Derive the "optimal" predictor ŷ(k, θ).
3. Find the "best" estimate (a numerical sketch of this criterion follows below)
$$\hat{\theta}_N = \arg\min_{\theta} J_N(\theta), \qquad J_N(\theta) = \frac{1}{N}\sum_{k=1}^{N}\|y(k) - \hat{y}(k, \theta)\|_2^2$$
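A minimal sketch of step 3, assuming a toy one-parameter predictor ŷ(k, θ) = θ·u(k−1); this predictor and the data are illustrative, not from the lecture.

```python
# Sketch: evaluating and minimizing the prediction-error criterion J_N(theta).
import numpy as np
from scipy.optimize import minimize

def y_hat(k, theta, u):
    # hypothetical one-parameter predictor: y_hat(k, theta) = theta * u(k-1)
    return theta[0] * u[k - 1] if k >= 1 else 0.0

def J_N(theta, y, u):
    N = len(y)
    return sum((y[k] - y_hat(k, theta, u)) ** 2 for k in range(N)) / N

rng = np.random.default_rng(1)
u = rng.standard_normal(100)
y = 0.8 * np.concatenate(([0.0], u[:-1])) + 0.05 * rng.standard_normal(100)

theta_hat = minimize(J_N, x0=[0.0], args=(y, u)).x
print(theta_hat)                          # close to 0.8
```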
SGM:
$$\begin{cases} x(k+1) = Ax(k) + Bu(k) \\ y(k) = Cx(k) + Du(k) \end{cases} \qquad x(k) \in \mathbb{R}^n$$
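A minimal simulation sketch of this SGM; the matrices below are assumed example values, not from the slides.

```python
# Sketch: simulating x(k+1) = A x(k) + B u(k), y(k) = C x(k) + D u(k).
import numpy as np

A = np.array([[0.9, 0.2], [0.0, 0.7]])    # example system with n = 2
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

def simulate(A, B, C, D, u, x0):
    x, ys = x0, []
    for uk in u:
        ys.append(C @ x + D @ uk)
        x = A @ x + B @ uk
    return np.array(ys)

u = np.random.default_rng(2).standard_normal((50, 1))
y = simulate(A, B, C, D, u, x0=np.zeros(2))
```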
in $\mathbb{R}^3$. What do we see?
Plot for k = 0, 1, 2, …
$$\begin{bmatrix} y(0) & y(1) & \cdots & y(N-2) \\ y(1) & y(2) & \cdots & y(N-1) \\ y(2) & y(3) & \cdots & y(N) \end{bmatrix} = \begin{bmatrix} C \\ CA \\ CA^{2} \end{bmatrix} \begin{bmatrix} x(0) & x(1) & \cdots & x(N-2) \end{bmatrix}$$
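A numerical sketch of this factorization for an assumed second-order autonomous system (the matrices are illustrative, not from the slides): the output Hankel matrix with three block rows has rank n = 2.

```python
# Sketch: build Y_{0,3,N-1} from autonomous-response data and check its rank.
import numpy as np

A = np.array([[0.9, 0.2], [-0.1, 0.8]])   # example observable system, n = 2
C = np.array([[1.0, 0.0]])
x = np.array([1.0, -1.0])

N = 50
y = []
for _ in range(N + 1):
    y.append((C @ x).item())
    x = A @ x
y = np.array(y)

# block Hankel matrix: row i holds y(i), y(i+1), ..., y(i+N-2)
Y = np.array([y[i:i + N - 1] for i in range(3)])
print(np.linalg.matrix_rank(Y))           # 2 = n
```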
$\operatorname{rank}(Y_{0,3,N-1}) = 2 = n$
Then: $U_{0,s,N}\,\Pi^{\perp}_{U_{0,s,N}} = 0$.
$$Y_{0,s,N}\,\Pi^{\perp}_{U_{0,s,N}} = \left(\mathcal{O}_s X_{0,N} + \mathcal{T}_s U_{0,s,N}\right)\Pi^{\perp}_{U_{0,s,N}} = \mathcal{O}_s X_{0,N}\,\Pi^{\perp}_{U_{0,s,N}}$$
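A minimal sketch of the projector used above, assuming $\Pi^{\perp}_{U} = I - U^T (U U^T)^{-1} U$ and a full-row-rank stand-in for $U_{0,s,N}$ with example dimensions.

```python
# Sketch: orthogonal projector onto the complement of the row space of U,
# verifying U @ Pi_perp = 0.
import numpy as np

rng = np.random.default_rng(3)
U = rng.standard_normal((6, 40))          # stand-in for U_{0,s,N}, full row rank

Pi_perp = np.eye(40) - U.T @ np.linalg.solve(U @ U.T, U)
print(np.allclose(U @ Pi_perp, 0))        # True
```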
$$\operatorname{range}\left(Y_{0,s,N}\,\Pi^{\perp}_{U_{0,s,N}}\right) \subseteq \operatorname{range}(\mathcal{O}_s)$$
Lemma:
If $u(k)$ is such that $\operatorname{rank}\begin{bmatrix} X_{0,N} \\ U_{0,s,N} \end{bmatrix} = n + sm$ and $(A, C)$ is observable, then
$$\operatorname{rank}\left(Y_{0,s,N}\,\Pi^{\perp}_{U_{0,s,N}}\right) = n \quad\text{and}\quad \operatorname{range}\left(Y_{0,s,N}\,\Pi^{\perp}_{U_{0,s,N}}\right) = \operatorname{range}(\mathcal{O}_s),$$
from which $A_T$ and $C_T$ (the system matrices up to a similarity transformation) can be recovered.
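A minimal sketch of how this lemma is typically exploited (the function below is an assumed illustration, not the lecture's algorithm): an SVD of $Y_{0,s,N}\Pi^{\perp}_{U_{0,s,N}}$ yields a basis of $\operatorname{range}(\mathcal{O}_s)$, from which $C_T$ is the first block row and $A_T$ follows from the shift structure of the extended observability matrix.

```python
# Sketch: recover A_T, C_T (up to similarity) from the projected output Hankel matrix.
import numpy as np

def estimate_AT_CT(Y, U, n, l):
    """Y: (s*l x N) output Hankel matrix, U: (s*m x N) input Hankel matrix,
    n: assumed state order, l: number of outputs per block row."""
    Pi_perp = np.eye(U.shape[1]) - U.T @ np.linalg.solve(U @ U.T, U)
    Usv, _, _ = np.linalg.svd(Y @ Pi_perp)
    Os = Usv[:, :n]                       # basis for range(O_s)
    C_T = Os[:l, :]                       # first block row of O_s
    # shift invariance: Os[l:, :] = Os[:-l, :] @ A_T  (solved in least-squares sense)
    A_T = np.linalg.lstsq(Os[:-l, :], Os[l:, :], rcond=None)[0]
    return A_T, C_T
```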
$$y(k) = C_T A_T^{k} x(0) + \sum_{i=0}^{k-1}\left(u(i)^T \otimes C_T A_T^{\,k-i-1}\right)\operatorname{vec}(B_T) + \left(u(k)^T \otimes I\right)\operatorname{vec}(D_T) = F(k, A_T, C_T)\,\theta,$$
$$\theta = \begin{bmatrix} x(0)^T & \operatorname{vec}(B_T)^T & \operatorname{vec}(D_T)^T \end{bmatrix}^T$$
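A minimal sketch (assumed, not the lecture's code) of the resulting linear least-squares step: with $A_T$, $C_T$ fixed, build $F(k, A_T, C_T)$ and solve for θ. Here vec(·) is taken column-wise, and the dimensions n, m, l are assumptions of the example.

```python
# Sketch: y(k) is linear in theta = [x(0)^T, vec(B_T)^T, vec(D_T)^T]^T,
# so x(0), B_T, D_T follow from ordinary least squares once A_T, C_T are known.
import numpy as np

def regressor_row(k, A_T, C_T, u, n, m, l):
    """Build F(k, A_T, C_T) such that y(k) = F(k, A_T, C_T) @ theta."""
    F_x0 = C_T @ np.linalg.matrix_power(A_T, k)                       # l x n
    F_B = sum(np.kron(u[i], C_T @ np.linalg.matrix_power(A_T, k - i - 1))
              for i in range(k)) if k > 0 else np.zeros((l, n * m))   # l x n*m
    F_D = np.kron(u[k], np.eye(l))                                    # l x l*m
    return np.hstack([F_x0, F_B, F_D])

def estimate_theta(A_T, C_T, u, y, n, m, l):
    """u: (N x m) inputs, y: (N x l) outputs; returns the least-squares theta."""
    F = np.vstack([regressor_row(k, A_T, C_T, u, n, m, l) for k in range(len(y))])
    return np.linalg.lstsq(F, y.reshape(-1), rcond=None)[0]
```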
Next lecture:
Lecture 6: Subspace Identification, Step Inputs and Ambient Excitation
Keep your eyes focused on "Guidelines/Rules/Schedule sc4040: 2016-2017" for the correct deadlines!