Bayesian Differential Privacy for Linear Dynamical Systems
Abstract—Differential privacy is a privacy measure based on the difficulty of discriminating between similar input data. In differential privacy analysis, similar data usually implies that their distance does not exceed a predetermined threshold. Consequently, it does not take into account the difficulty of distinguishing data sets that are far apart, which often contain highly private information. This problem has been pointed out in the research on differential privacy for static data, and Bayesian differential privacy has been proposed, which provides a privacy protection level even for outlier data by utilizing the prior distribution of the data. In this letter, we introduce this Bayesian differential privacy to dynamical systems, provide privacy guarantees for distant input data pairs, and reveal its fundamental properties. For example, we design a mechanism that satisfies the desired level of privacy protection, which characterizes the trade-off between privacy and information utility.

Index Terms—Control system security, differential privacy, stochastic system.

I. INTRODUCTION

[...] and methods of controller design with privacy protection in mind have been studied [9]–[11].

Conventional differential privacy is a privacy measure based on the difficulty of distinguishing similar data, where x and x′ are regarded as being similar if |x − x′| ≤ c for a prescribed c > 0. To put it conversely, there is no indistinguishability guarantee for x and x′ if |x − x′| > c. This implies that there is a risk of information leakage when there are outliers from normal data, as pointed out in [12]. For example, unusual electricity consumption patterns may contain highly private information about the consumer's lifestyle. In [13], a new concept called Bayesian differential privacy is developed for static data to solve this problem. Bayesian differential privacy considers the underlying probability distribution of the data and attempts to guarantee privacy for data sets that are far apart.

In this letter, we consider a prior distribution for the signal that we want to keep secret and introduce Bayesian differential privacy for linear dynamical systems. Similar to the conventional differential privacy case [9], we consider a mechanism where stochastic noise is added to the output data. Note that applying a large noise increases the privacy protection level, but decreases the information usefulness [14].
To proceed with differential privacy analysis, we consider the output y_w(t) := y(t) + w(t) after adding the noise w(t) ∈ ℝ^q; see Fig. 1. From (2), Y_{w,T} ∈ ℝ^{(T+1)q} can be described by

  Y_{w,T} = N_T U_T + W_T.  (5)

Larger noise W_T and a lower threshold for the adjacency, in the sense of Σ_w and K respectively, make the left-hand side of (8) larger. This implies that differential privacy is ensured for smaller ε and δ because R is a decreasing function. Further insight into K will be revealed in Corollary 1 below.
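For concreteness, the following is a minimal simulation sketch of the output-noise mechanism (5), assuming N_T is the block lower-triangular matrix of Markov parameters that maps the input sequence U_T to the noise-free output sequence, as in (2); the scalar system matrices and the noise level below are hypothetical placeholders, not values from the letter.

```python
import numpy as np

def io_matrix(A, B, C, D, T):
    """Block lower-triangular N_T with Markov-parameter blocks
    D, CB, CAB, ..., mapping U_T = [u(0);...;u(T)] to [y(0);...;y(T)]."""
    q, m = D.shape
    blocks = [D] + [C @ np.linalg.matrix_power(A, j - 1) @ B
                    for j in range(1, T + 1)]
    N = np.zeros(((T + 1) * q, (T + 1) * m))
    for i in range(T + 1):
        for j in range(i + 1):  # causal: y(i) depends only on u(0..i)
            N[i*q:(i+1)*q, j*m:(j+1)*m] = blocks[i - j]
    return N

rng = np.random.default_rng(0)
T = 20
A, B = np.array([[0.9]]), np.array([[1.0]])   # hypothetical system
C, D = np.array([[1.0]]), np.array([[0.0]])

N_T = io_matrix(A, B, C, D, T)
U_T = rng.normal(size=(T + 1, 1))             # private input sequence
W_T = rng.normal(scale=0.5, size=(T + 1, 1))  # W_T ~ N(0, 0.25 I), output noise
Y_w = N_T @ U_T + W_T                         # released data, eq. (5)
```

The same routine, applied to (A_r, B_r, C_r, D_r), would also produce the block Toeplitz matrix T_T appearing in Example 1 below.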
III. BAYESIAN DIFFERENTIAL PRIVACY FOR DYNAMICAL SYSTEMS

A. Formulation

In Definition 2, the difficulty of distinguishing data pairs whose K-weighted distance is larger than the threshold 1 is not taken into account. Note also that there is no design guideline for K. In this section, we introduce Bayesian differential privacy for dynamical systems. To this end, we assume the following availability of the prior distribution of the data to be protected, and provide a privacy guarantee that takes into account the discrimination difficulty of data pairs based on the prior.

Assumption 1: The input data to the mechanism, U_T, is an ℝ^{(T+1)m}-valued random variable with distribution P_{U_T}. In addition, one can use the prior P_{U_T} to design a mechanism.

The following is a typical example where a private input signal is a realization of a random variable.

Example 1: Suppose that the input data u(t) to be protected is the reference for tracking control; see [14]. In many applications, tracking to the reference signal over specified frequency ranges is required. Such a control objective can be represented by filtering white noise ξ(t). To be more precise, we assume u(t) = r(t) is generated by

  x_r(t + 1) = A_r x_r(t) + B_r ξ(t),  x_r(0) = 0,  (10)
  r(t) = C_r x_r(t) + D_r ξ(t),  (11)
  ξ(t) ∼ N(0, I),  t ∈ {0, 1, . . . , T},  (12)

where A_r ∈ ℝ^{l×l}, B_r ∈ ℝ^{l×m}, C_r ∈ ℝ^{m×l}, D_r ∈ ℝ^{m×m}. The power spectrum of u(t) is characterized by the frequency transfer function

  G_r(e^{jλ}) := C_r(e^{jλ} I − A_r)^{−1} B_r + D_r,  λ ∈ [0, π).  (13)

In this case,

  r(t) = D_r ξ(0)  (t = 0),
  r(t) = D_r ξ(t) + Σ_{j=1}^{t} C_r A_r^{j−1} B_r ξ(t − j)  (t ≥ 1),  (14)

and the prior distribution is given by U_T ∼ N(0, Σ_U) with

  Σ_U := T_T T_T^⊤,  (15)
  T_T := T_T(A_r, B_r, C_r, D_r).  (16)

Note that the step reference signal whose value obeys r(t) ≡ r̄ ∼ N(0, Σ_s) can be modeled by setting A_r = C_r = I, B_r = D_r = 0 with x_r(0) ∼ N(0, Σ_s) in (10), (11). This corresponds to the case where the initial state is the private information rather than the input sequence U_T, as discussed in [9].

Then, based on the Bayesian differential privacy for static data [13], we define (P_{U_T}, γ, ε, δ)-Bayesian differential privacy, which is an extension of differential privacy for dynamical systems.

Definition 3 [(P_{U_T}, γ, ε, δ)-Bayesian Differential Privacy]: Assume that the random variables U_T, U′_T are independent and both follow the distribution P_{U_T}. Given 1 ≥ γ ≥ 0, ε > 0, and δ ≥ 0, the mechanism (5) is said to be (P_{U_T}, γ, ε, δ)-Bayesian differentially private ((P_{U_T}, γ, ε, δ)-BDP) at a finite time instant T ∈ ℤ_+, if

  P[ P[N_T U_T + W_T ∈ S | U_T] ≤ e^ε P[N_T U′_T + W_T ∈ S | U′_T] + δ ] ≥ γ,  ∀S ∈ B(ℝ^{(T+1)q}).  (17)

In (17), the outer (resp. inner) P is taken with respect to (U_T, U′_T) (resp. W_T). Roughly speaking, the definition of BDP is that the probability that the mechanism satisfies (K, ε, δ)-DP is greater than or equal to γ. Note that this definition places no direct restriction on the distance between a pair of input data U_T, U′_T.

B. Sufficient Condition for Noise Scale

It is desirable that the added noise w is small to retain the data usefulness; see, e.g., Section V. The following theorem gives a sufficient condition on the noise scale to guarantee (P_{U_T}, γ, ε, δ)-Bayesian differential privacy.

Theorem 2: Suppose that the prior distribution P_{U_T} of U_T is N_{(T+1)m}(0, Σ). The Gaussian mechanism (5) induced by W_T ∼ N_{(T+1)q}(μ_w, Σ_w) is (P_{U_T}, γ, ε, δ)-Bayesian differentially private at a finite time T ∈ ℤ_+ with 1 ≥ γ ≥ 0, ε > 0, and 1/2 > δ > 0, if the covariance matrix Σ_w ≻ 0 is chosen such that

  λ_max^{−1/2}(O_{Σ,Σ_w,T}) ≥ c(γ, T) R(ε, δ),  (18)

where O_{Σ,Σ_w,T} is defined by

  O_{Σ,Σ_w,T} := Σ^{1/2} N_T^⊤ Σ_w^{−1} N_T Σ^{1/2},  (19)

and c(γ, T) is the unique c > 0 that satisfies

  γ = ∫_0^{c²/2} 1 / (2^{(T+1)m/2} Γ((T+1)m/2)) · x^{(T+1)m/2 − 1} e^{−x/2} dx.  (20)

Proof: Using a similar argument as in the proof of [9, Th. 2.6], for any fixed U, U′ ∈ ℝ^{(T+1)m}, one has

  P[N_T U_T + W_T ∈ S | U_T = U]
    ≤ e^ε P[N_T U_T + W_T ∈ S | U_T = U′] + P[Z ≥ εh − 1/(2h) | U_T = U, U′_T = U′],

where

  h := |Y − Y′|^{−1}_{Σ_w^{−1}},  Y := N_T U_T,  Y′ := N_T U′_T,

and Z ∼ N(0, 1). Then, the mechanism is (P_{U_T}, γ, ε, δ)-Bayesian differentially private if Q(εh − 1/(2h)) ≤ δ with probability at least γ, i.e.,

  P[h ≥ R(ε, δ)] ≥ γ.  (21)

The inequality (21) holds if (18) is satisfied. This is because

  h^{−2} = |N_T(U_T − U′_T)|²_{Σ_w^{−1}}
         = (Σ^{−1/2}(U_T − U′_T))^⊤ O_{Σ,Σ_w,T} Σ^{−1/2}(U_T − U′_T)
         ≤ |Σ^{−1/2}(U_T − U′_T)|² λ_max(O_{Σ,Σ_w,T}),

and then, from the fact that |Σ^{−1/2}(U_T − U′_T)|²/2 follows the χ² distribution with (T + 1)m degrees of freedom and the definition of c(γ, T), we have |Σ^{−1/2}(U_T − U′_T)|² ≤ c(γ, T)² with probability γ.

In order to clarify the connection between conventional and Bayesian DP, it is worthwhile comparing Theorems 1 and 2. Bayesian differential privacy with the prior distribution P_{U_T} = N_{(T+1)m}(0, Σ) corresponds to differential privacy with an adjacency weight K.
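To illustrate how Theorem 2 could be used numerically, the following is a minimal sketch under two assumptions of ours: R(ε, δ) is taken in the Gaussian-mechanism form of [9], R(ε, δ) = (Q^{−1}(δ) + √(Q^{−1}(δ)² + 2ε))/(2ε), since the letter's own definition of R lies outside this excerpt, and the prior covariance Σ is a placeholder identity rather than, e.g., Σ_U = T_T T_T^⊤ from (15). Because the right-hand side of (20) is the χ² CDF with (T + 1)m degrees of freedom evaluated at c²/2, c(γ, T) is available from the inverse CDF.

```python
import numpy as np
from scipy.stats import chi2, norm

def c_gamma(gamma, T, m):
    """c(gamma, T) from (20): gamma = P[chi2 with (T+1)m dof <= c^2 / 2]."""
    return np.sqrt(2.0 * chi2.ppf(gamma, df=(T + 1) * m))

def R(eps, delta):
    """Assumed Gaussian-mechanism constant in the form of [9] (see lead-in):
    R = (Q^{-1}(delta) + sqrt(Q^{-1}(delta)^2 + 2*eps)) / (2*eps)."""
    kd = norm.isf(delta)  # Q^{-1}(delta), inverse Gaussian tail
    return (kd + np.sqrt(kd**2 + 2.0 * eps)) / (2.0 * eps)

def satisfies_condition_18(Sigma, Sigma_w, N_T, gamma, eps, delta, m, T):
    """Check (18): lambda_max(O)^{-1/2} >= c(gamma,T) R(eps,delta),
    with O = Sigma^{1/2} N_T' Sigma_w^{-1} N_T Sigma^{1/2} as in (19)."""
    L = np.linalg.cholesky(Sigma)  # eig(L' M L) = eig(Sigma^{1/2} M Sigma^{1/2})
    O = L.T @ N_T.T @ np.linalg.solve(Sigma_w, N_T) @ L
    lam_max = np.linalg.eigvalsh(O).max()
    return lam_max ** (-0.5) >= c_gamma(gamma, T, m) * R(eps, delta)

if __name__ == "__main__":
    T, m, q = 20, 1, 1
    gamma, eps, delta = 0.95, 1.0, 0.05
    Sigma = np.eye((T + 1) * m)   # placeholder prior covariance of U_T
    N_T = np.eye((T + 1) * q)     # placeholder system matrix
    # i.i.d. noise level sized to meet (18) with a small margin; this is the
    # same scaling c^2 R^2 lambda_max(Sigma_U) used for comparison in (35):
    sigma2 = 1.01 * (c_gamma(gamma, T, m) * R(eps, delta)) ** 2 \
             * np.linalg.eigvalsh(Sigma).max()
    Sigma_w = sigma2 * np.eye((T + 1) * q)
    print(satisfies_condition_18(Sigma, Sigma_w, N_T, gamma, eps, delta, m, T))
```

With these placeholder values the check prints True; shrinking sigma2 below c(γ, T)² R(ε, δ)² λ_max(Σ) makes the sufficient condition (18) fail.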
Fig. 3. Mechanism with input noise.

Fig. 4. Feedback system with input noise mechanism.
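As an illustration only, the following is a minimal sketch of the input-noise mechanism depicted in Fig. 3, under the assumption that the Gaussian perturbation V_T is added to the private input sequence before it drives the system, so that the released output is N_T(U_T + V_T); the interface is hypothetical.

```python
import numpy as np

def input_noise_mechanism(N_T, U_T, sigma2, rng):
    """Input-noise mechanism (cf. Fig. 3): perturb the private input
    sequence with V_T ~ N(0, sigma2 I), then release the system output."""
    V_T = np.sqrt(sigma2) * rng.normal(size=U_T.shape)
    return N_T @ (U_T + V_T)
```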
Fig. 6. Reference signal r(t) and plant output y_p(t) for three mechanisms.

In what follows we compare the following three cases:
• noise-free,
• i.i.d. noise: V_T ∼ N_{(T+1)m}(0, σ²_iid I) with

  Tr(σ²_iid I) = c(γ, T)² R(ε, δ)² λ_max(Σ_U)(T + 1)m = 121.604,  (35)

• [...]

VI. CONCLUSION

In this letter, we introduced Bayesian differential privacy for linear dynamical systems using prior distributions of input data to provide privacy guarantees even for input data pairs with large differences, and gave sufficient conditions to achieve it. Furthermore, we derived the minimum-energy Gaussian noise that satisfies the condition. As noted in Section III-C, no finite noise can guarantee Bayesian differential privacy in the infinite-horizon case. This issue will be addressed in future work.