
IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 19, NO. 9, SEPTEMBER 2010

Joint NDT Image Restoration and Segmentation Using Gauss–Markov–Potts Prior Models and Variational Bayesian Computation

Hacheme Ayasso, Student Member, IEEE, and Ali Mohammad-Djafari, Member, IEEE

Manuscript received February 27, 2009; revised October 12, 2009; accepted March 09, 2010. First published April 08, 2010; current version published August 18, 2010. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Eero P. Simoncelli.
The authors are with the Laboratoire des Signaux et Systèmes, Unité mixte de recherche 8506 (Univ Paris-Sud — CNRS — SUPELEC), Supélec, Plateau de Moulon, 91192 Gif-sur-Yvette, France (e-mail: [email protected]).
Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/TIP.2010.2047902

Abstract—In this paper, we propose a method to simultaneously restore and segment piecewise homogeneous images degraded by a known point spread function (PSF) and additive noise. For this purpose, we propose a family of nonhomogeneous Gauss–Markov fields with Potts region labels as a model for the images, to be used in a Bayesian estimation framework. The joint posterior law of all the unknowns (the unknown image, its segmentation (hidden variable), and all the hyperparameters) is approximated by a separable probability law via the variational Bayes technique. This approximation makes it possible to obtain a practically implementable joint restoration and segmentation algorithm. We present some preliminary results and a comparison with an MCMC Gibbs sampling based algorithm. We may note that the prior models proposed in this work are particularly appropriate for images of scenes or objects that are composed of a finite set of homogeneous materials. This is the case of many images obtained in nondestructive testing (NDT) applications.

Index Terms—Bayesian estimation, image restoration, segmentation, variational Bayes approximation.

I. INTRODUCTION

A simple direct model of the image restoration problem is given by

    g(r) = h(r) * f(r) + \epsilon(r), \quad r \in \mathcal{R}, \quad \text{or, equivalently,} \quad g = H f + \epsilon    (1)

where g(r) is the observed image, h(r) is a known point spread function, f(r) is the unknown image, and \epsilon(r) is the measurement error; equivalently, g, f, and \epsilon are vectors containing samples of g(r), f(r), and \epsilon(r), respectively, and H is a huge matrix whose elements are determined using samples of h(r). \mathcal{R} is the whole space of the image surface.

In a Bayesian framework for such an inverse problem, one starts by writing the expression of the posterior law

    p(f | g, \theta, \mathcal{M}) = \frac{p(g | f, \theta_1, \mathcal{M}) \, p(f | \theta_2, \mathcal{M})}{p(g | \theta, \mathcal{M})}    (2)

where \theta = (\theta_1, \theta_2) groups the hyperparameters, p(g | f, \theta_1, \mathcal{M}), called the likelihood, is obtained using the forward model (1) and the assigned probability law of the errors, p(f | \theta_2, \mathcal{M}) is the assigned prior law for the unknown image, and

    p(g | \theta, \mathcal{M}) = \int p(g | f, \theta_1, \mathcal{M}) \, p(f | \theta_2, \mathcal{M}) \, \mathrm{d}f    (3)

is the evidence of the model \mathcal{M}. Assigning Gaussian priors

    p(g | f, \theta_1) = \mathcal{N}(H f, \sigma_\epsilon^2 I)    (4a)
    p(f | \theta_2) = \mathcal{N}(0, \sigma_f^2 (D^t D)^{-1})    (4b)

it is easy to show that the posterior law is also Gaussian

    p(f | g, \theta) = \mathcal{N}(\hat{f}, \hat{\Sigma})    (5a)

with

    \hat{f} = (H^t H + \lambda D^t D)^{-1} H^t g    (5b)

and

    \hat{\Sigma} = \sigma_\epsilon^2 (H^t H + \lambda D^t D)^{-1}, \quad \lambda = \sigma_\epsilon^2 / \sigma_f^2    (5c)

which can also be obtained as the solution that minimizes

    J(f) = \| g - H f \|^2 + \lambda \| D f \|^2    (6)

where we can see the link with the classical regularization theory [1].
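A minimal numerical sketch of the quadratic solution (5b)/(6) follows; it is an illustration only, with our own choices of a circulant blur H, a first-difference operator D, and the regularization weight \lambda, none of which are taken from the paper.

```python
import numpy as np

# 1-D illustration of (5b)/(6): f_hat = (H'H + lam D'D)^{-1} H' g.
# The blur kernel, difference operator D, and lam are illustrative choices.
n = 64
rng = np.random.default_rng(0)

h = np.array([0.25, 0.5, 0.25])              # 3-tap blur kernel
H = np.zeros((n, n))
for i in range(n):
    for j, hj in enumerate(h):
        H[i, (i + j - 1) % n] = hj           # circulant convolution matrix

D = np.eye(n) - np.roll(np.eye(n), -1, axis=0)   # first differences

f_true = np.concatenate([np.zeros(n // 2), np.ones(n - n // 2)])
g = H @ f_true + 0.05 * rng.standard_normal(n)   # forward model (1)

lam = 0.1                                    # lam = sigma_eps^2 / sigma_f^2
f_hat = np.linalg.solve(H.T @ H + lam * D.T @ D, H.T @ g)
print(np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true))
```

The quadratic regularizer smooths across region boundaries, which is precisely the limitation that motivates the nonhomogeneous priors introduced below.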



For more general cases, using the MAP estimate

    \hat{f} = \arg\max_f \, p(f | g, \theta)    (7)

we have

    \hat{f} = \arg\min_f \, J(f)    (8)

where J(f) = \| g - H f \|^2 + \lambda \Omega(f) and \lambda = \sigma_\epsilon^2 / \sigma_f^2. Two families of priors can be distinguished: separable ones, \Omega(f) = \sum_{r \in \mathcal{R}} \phi(f(r)), and Markovian ones, \Omega(f) = \sum_{r \in \mathcal{R}} \sum_{r' \in \mathcal{V}(r)} \phi(f(r) - f(r')), where different expressions have been used for the potential function \phi [2]–[4] with great success in many applications.

Still, this family of priors cannot give a precise model for the unknown image in many applications, due to the assumption of global homogeneity of the image. For this reason, we have chosen in this paper to use a nonhomogeneous prior model that takes into account the existence of contours in most images. In particular, we aim to simultaneously obtain a restored image and its segmentation, which means that we are interested in images composed of a finite number of homogeneous regions. This implies the introduction of the hidden variable z = \{z(r), r \in \mathcal{R}\}, which associates each pixel f(r) with a label (class) z(r) \in \{1, \ldots, K\}, where K is the number of classes and \mathcal{R} represents the whole space of the image surface. All pixels f(r) with the same label z(r) = k share some properties, for example the mean gray level, the mean variance, and the same correlation structure. Indeed, we use a Potts–Markov model for the hidden label variable to model the spatial structure of the regions. As we will see later, the parameters of these models can control the mean size of the regions in the image. Even if we assume that the pixels inside a region are mutually independent of those of other regions, for the pixels inside a given region we propose two models: independent or Markovian, i.e., the image is modeled as a mixture of independent Gaussians or a mixture of Gauss–Markov fields. However, this choice of prior makes it impossible to get an analytical expression for the maximum a posteriori (MAP) or posterior mean (PM) estimators. Consequently, we will use the variational Bayes technique to calculate an approximate form of this law.

The problem of image deconvolution in general, and in a Bayesian framework in particular, has been widely discussed in [2], [3], [5], [6]. We present here the main contributions to this problem, knowing that this list is far from being exhaustive. From the point of view of prior choice, [7] used a Gaussian prior to restore the image. More sophisticated priors were proposed in [8] and [9] by means of Markov random fields. The choice of nonquadratic potentials was studied in [10]–[12]. In a multiresolution context, we take the example of [13] and [14], where several priors were employed in the wavelet domain.

From the posterior approximation point of view, the variational Bayes technique (or ensemble learning) was first introduced for neural network applications [15], [16]. Then it was applied to graphical model learning in [17], where several priors were studied. In [18], model parameter estimation in a variational Bayes context with a Gaussian prior over these parameters was studied. More work related to this subject can be found in Section IV-B and [19].

The variational Bayes technique was introduced for image recovery problems in [20]. Since then it has found a number of applications in this field. Smooth Gaussian priors were implemented for blind image deconvolution in [21]. An extension with a hierarchical model was proposed in [22]. Nonsmooth priors based upon total variation (TV) and products of Student's-t priors for image restoration were used in [23] and [24], respectively. For blind image deconvolution using nonsmooth priors, the variational approximation was used in [25], where a TV-based prior was used for the image and a Gaussian for the point-spread function (PSF), and in [26], where a Student's-t prior was used for the image and a kernel-based Student's-t prior for the PSF.

The rest of this paper is organized as follows. In Section II, we give more details about the proposed prior models. In Section III, we employ these priors in the Bayesian framework to obtain a joint posterior law of the unknowns (image pixels, hidden variable, and the hyperparameters, including the region statistical parameters and the noise variance). Then, in Section IV, we use the variational Bayes approximation in order to obtain a tractable approximation of the joint posterior law. In Section V, we show some image restoration examples. Finally, in Section VI, we provide our conclusions for this work.

II. PROPOSED GAUSS–MARKOV–POTTS PRIOR MODELS

As presented in the previous section, the main assumption used here is the piecewise homogeneity of the restored image. This model corresponds to a number of applications where the studied data are obtained by imaging objects composed of a finite number of materials. This is the case in medical imaging (muscle and bone, or grey and white matter). Likewise, in nondestructive testing (NDT) imaging for industrial applications, the studied materials are, in general, composed of air-metal or air-metal-composite. This prior model has already been used in several works and applications [27]–[30].

In fact, this assumption permits us to associate a label (class) z(r) with each pixel f(r) of the image. The set of these labels z = \{z(r), r \in \mathcal{R}\} forms a color image, where K corresponds to the number of materials and \mathcal{R} represents the entire image pixel area. This discrete-valued hidden field represents the segmentation of the image.

Moreover, all pixels which have the same label z(r) = k share the same probabilistic parameters (class means m_k and class variances v_k, k = 1, \ldots, K). Indeed, these pixels have a similar spatial structure, while we assume here that pixels from different classes are a priori independent, which is natural since they image different materials. This will be a key hypothesis when introducing the Gauss–Markov prior model of the source later in this section.

Consequently, we can give the prior probability law of a pixel, given the class it belongs to, as a Gaussian (homogeneity inside the same class)

    p(f(r) | z(r) = k) = \mathcal{N}(m_k, v_k)    (9)

This gives a mixture of Gaussians (MoG) model for the pixel f(r), which can be written as follows:

    p(f(r)) = \sum_{k=1}^{K} a_k \, \mathcal{N}(m_k, v_k), \quad a_k = P(z(r) = k)    (10)
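As a quick illustration of (9) and (10), the sketch below draws pixels from the class-conditional Gaussians and evaluates the resulting mixture density. The class parameters and proportions are arbitrary illustrative values, and the labels are drawn i.i.d. here only for simplicity; the paper draws z from a Potts law, introduced below.

```python
import numpy as np

# Illustration of (9)-(10): pixels are drawn from class-conditional
# Gaussians N(m_k, v_k); marginally each pixel follows a K-component
# mixture with weights a_k = P(z(r) = k). All parameter values are
# arbitrary stand-ins.
rng = np.random.default_rng(1)
K = 3
m = np.array([0.0, 0.5, 1.0])     # class means m_k
v = np.array([0.01, 0.02, 0.01])  # class variances v_k
a = np.array([0.5, 0.3, 0.2])     # mixture weights a_k

z = rng.choice(K, size=(32, 32), p=a)   # label field (segmentation)
f = rng.normal(m[z], np.sqrt(v[z]))     # image pixels given the labels, (9)

def mog_pdf(x):
    """Marginal mixture density of a pixel value, eq. (10)."""
    return sum(a[k] * np.exp(-(x - m[k]) ** 2 / (2 * v[k]))
               / np.sqrt(2 * np.pi * v[k]) for k in range(K))

print(f.mean(), mog_pdf(0.5))
```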

Fig. 1. Proposed a priori model for the images: the image pixels f(r) are assumed to be classified in K classes; z(r) represents those classes (segmentation). In the MIG prior (a), we assume the image pixels in each class to be independent, while in the MGM prior (b), the image pixels are considered dependent. In both cases, the hidden field values follow a Potts model.

Fig. 2. Hierarchical prior model where our variable of interest f can be modeled by a mixture of independent Gaussians (MIG) or a mixture of Gauss–Markov fields (MGM) with mean values m and variances v. The hidden field z follows an external-field Potts model with \alpha and \gamma as hyperparameters. Meanwhile, the error prior is supposed Gaussian with unknown variance \sigma_\epsilon^2. Conjugate priors were chosen for m, v, \sigma_\epsilon, and \alpha, while \gamma is chosen as a fixed value.

Modeling the spatial interactions between the different elements of the prior model is an important issue. This study is concerned with two interactions: pixels of the image within the same class, z(r) = z(r'), and elements of the hidden variable z. In this paper, we assign a Potts model to the hidden field z in order to obtain more homogeneous classes in the image. Meanwhile, we present two models for the image pixels f(r): the first is independent, while the second is a Gauss–Markov model. In the following, we give the prior probability of the image pixels and of the hidden field elements for the two models.

A. Mixture of Independent Gaussians (MIG)

In this case, no prior dependence is assumed for the elements of f given z:

    p(f(r) | z(r) = k, m_k, v_k) = \mathcal{N}(m_k, v_k)    (11a)
    p(f | z, m, v) = \prod_{r \in \mathcal{R}} \mathcal{N}(m_{z(r)}, v_{z(r)})    (11b)

with m_{z(r)} = m_k and v_{z(r)} = v_k for all r \in \mathcal{R}_k, where

    \mathcal{R}_k = \{ r : z(r) = k \}    (12)

B. Mixture of Gauss–Markov Fields (MGM)

In the MIG model, the pixels of the image in different regions are assumed independent. Furthermore, all the pixels inside a region are also assumed conditionally independent. Here, we relax this last assumption by considering the pixels in a region Markovian with the four nearest neighbors:

    p(f(r) | z(r), f(r'), r' \in \mathcal{V}(r)) = \mathcal{N}(\mu_z(r), v_{z(r)})    (13)

    \mu_z(r) = \frac{1}{|\mathcal{V}(r)|} \sum_{r' \in \mathcal{V}(r)} \tilde{f}(r')    (14)

with

    \tilde{f}(r') = \begin{cases} f(r') & \text{if } z(r') = z(r) \\ m_{z(r)} & \text{if } z(r') \neq z(r) \end{cases}    (15)

    q(r) = 1 - \delta(z(r) - z(r')), \quad r' \in \mathcal{V}(r)    (16)

We may remark that p(f | z) is a nonhomogeneous Gauss–Markov field, because the means \mu_z(r) are functions of the pixel position r. As a by-product, note that q(r) represents the contours of the image: q(r) = 1 if z(r) \neq z(r') and q(r) = 0 elsewhere.

For both cases, a Potts Markov model will be used to describe the hidden field prior law:

    p(z | \alpha, \gamma) \propto \exp\left\{ \sum_{r \in \mathcal{R}} \alpha_{z(r)} + \gamma \sum_{r \in \mathcal{R}} \sum_{r' \in \mathcal{V}(r)} \delta(z(r) - z(r')) \right\}    (17)

where \alpha_k is the energy of the singleton cliques and \gamma is the Potts constant. The hyperparameters of the model are the class means m = \{m_k\}, the variances v = \{v_k\}, and finally the singleton clique energies \alpha = \{\alpha_k\}. The graphical model of the observation generation mechanism assumed here is given in Fig. 2.
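The following sketch illustrates the MGM neighborhood mean (13)-(15): neighbors from the same class contribute their values, while neighbors from a different class are replaced by the class mean, which is what the contour variable q(r) of (16) encodes. The array layout and boundary handling are our own illustrative conventions.

```python
import numpy as np

# Conditional mean of the MGM prior, (13)-(15): average of the 4 nearest
# neighbors, where a neighbor from another class is replaced by the class
# mean m_{z(r)} (the role of the contour variable q(r) in (16)).
def mgm_conditional_mean(f, z, m, i, j):
    H, W = f.shape
    vals = []
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < H and 0 <= nj < W:
            same = z[ni, nj] == z[i, j]
            vals.append(f[ni, nj] if same else m[z[i, j]])  # f_tilde, (15)
    return np.mean(vals)                                    # mu_z(r), (14)

rng = np.random.default_rng(2)
m = np.array([0.0, 1.0])
z = np.zeros((8, 8), dtype=int)
z[:, 4:] = 1                                # two vertical regions
f = rng.normal(m[z], 0.1)
print(mgm_conditional_mean(f, z, m, 3, 3))  # interior pixel of class 0
print(mgm_conditional_mean(f, z, m, 3, 4))  # pixel on the class boundary
```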

III. BAYESIAN RECONSTRUCTION AND SEGMENTATION

So far, we have presented two prior models for the unknown image, based upon the assumption that the object is composed of a known number of materials. That led us to the introduction of a hidden field, which assigns each pixel to a label corresponding to its material. Thus, each material can be characterized by the statistical properties (m_k, v_k). Now, in order to estimate the unknown image and its hidden field, we use the joint posterior law

    p(f, z | g, \theta) \propto p(g | f, \theta) \, p(f | z, \theta) \, p(z | \theta)    (18)

This requires the knowledge of p(f | z, \theta) and p(z | \theta), which we have already provided in the previous section, and of the model likelihood p(g | f, \theta), which depends on the error model. Classically, the error is chosen as a zero-mean Gaussian with variance \sigma_\epsilon^2, which gives

    p(g | f, \sigma_\epsilon^2) = \mathcal{N}(H f, \sigma_\epsilon^2 I)    (19)

In fact, the previous calculation assumes that the hyperparameter values are known, which is not the case in many practical applications. Consequently, these parameters have to be estimated jointly with the unknown image. This is possible using the Bayesian framework. We need to assign a prior model to each hyperparameter and write the joint posterior law

    p(f, z, \theta | g) \propto p(g | f, \theta) \, p(f | z, \theta) \, p(z | \theta) \, p(\theta)    (20)

where \theta = \{m, v, \alpha, \sigma_\epsilon^2\} groups all the unknown hyperparameters, which are the means m, the variances v, the singleton energies \alpha, and the error variance \sigma_\epsilon^2, while the Potts constant \gamma is chosen to be fixed, due to the difficulty of finding a conjugate prior for it. We choose an Inverse Gamma model for the error variance \sigma_\epsilon^2, a Gaussian for the means m_k, an Inverse Gamma for the variances v_k, and finally a Dirichlet for \alpha:

    p(\sigma_\epsilon^2) = \mathcal{IG}(a_{\epsilon 0}, b_{\epsilon 0})    (21a)
    p(m_k) = \mathcal{N}(m_0, v_0)    (21b)
    p(v_k) = \mathcal{IG}(a_0, b_0)    (21c)
    p(\alpha) = \mathcal{D}(\alpha_0, \ldots, \alpha_0)    (21d)

where m_0, v_0, a_0, b_0, a_{\epsilon 0}, b_{\epsilon 0}, and \alpha_0 are fixed for a given problem. This choice of conjugate priors is very helpful for the calculations that follow in the next section.

IV. BAYESIAN COMPUTATION

In the previous section, we found the necessary ingredients to obtain the expression of the joint posterior law. However, neither the joint maximum a posteriori (JMAP) estimate

    (\hat{f}, \hat{z}, \hat{\theta}) = \arg\max_{f, z, \theta} \, p(f, z, \theta | g)

nor the posterior means (PM), e.g., \hat{f} = E\{f | g\}, can be obtained in an analytical form. Therefore, we explore two approaches to solve this problem. The first is the Monte Carlo technique, and the second is the variational Bayes approximation.

A. Numerical Exploration and Integration via Monte Carlo Techniques

This method solves the previous problem by generating a great number of samples representing the posterior law, and then calculating the desired estimators numerically from these samples. The main difficulty comes from the generation of these samples. Markov chain Monte Carlo (MCMC) samplers are generally used in this domain, and they are of great interest because they explore the entire space of the probability density. The major drawback of this nonparametric approach is its computational cost: a great number of iterations are needed to reach convergence, and many samples are required to obtain good estimates of the parameters.

To apply this method to our problem, we use a Gibbs sampler. The basic idea of this approach is to generate samples from the posterior law (20) using the following general algorithm:

    f \sim p(f | z, \theta, g)    (22a)
    z \sim p(z | f, \theta, g)    (22b)
    \theta \sim p(\theta | f, z, g)    (22c)

We have the expressions of all the necessary probability laws on the right-hand side of these three conditional laws, so we are able to sample from them. Indeed, it is easy to show that the first one is a Gaussian, which is easy to handle. The second is a Potts field, for which many fast sampling methods exist [31]. The last one is also separable in its components and, due to the conjugacy property, it is easy to see that the posterior laws are Inverse Gamma, Inverse Wishart, Gaussian, or Dirichlet, for which there are standard sampling schemes [28], [32].
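The Gibbs sampler (22a)-(22c) has the simple sweeping structure sketched below; the three conditional samplers stand for the problem-specific steps described above and are left as placeholders.

```python
# Sweeping structure of the Gibbs sampler (22a)-(22c). The three conditional
# samplers are placeholders for the steps described in the text: a Gaussian
# draw for f, fast Potts moves for z [31], and conjugate draws for theta.
def gibbs_sampler(g, f, z, theta, sample_f, sample_z, sample_theta,
                  n_iter=1000, burn_in=200):
    samples = []
    for t in range(n_iter):
        f = sample_f(g, z, theta)        # (22a): f ~ p(f | z, theta, g)
        z = sample_z(g, f, theta)        # (22b): z ~ p(z | f, theta, g)
        theta = sample_theta(g, f, z)    # (22c): theta ~ p(theta | f, z, g)
        if t >= burn_in:
            samples.append((f, z, theta))
    return samples  # posterior means are empirical averages over the samples
```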
B. Variational or Separable Approximation Techniques

One of the main difficulties in obtaining an analytical expression for the estimator is the posterior dependence between the unknown parameters. For this reason, we propose, for this method, a separable form of the joint posterior law, and then we try to find the closest posterior to the original one under this constraint. The idea of approximating a joint probability law p by a separable law q is not new [33]–[36]. The selection of parametric families of q for which the computations can be easily done has been addressed recently for data mining and classification problems [37]–[44]. However, their use in Bayesian computations for inverse problems in general, and for image restoration with this class of prior models in particular, is one of the contributions of this work.

We consider the problem of approximating the joint pdf p(f, z, \theta | g) by a separable one, q(f, z, \theta) = q_1(f) \, q_2(z) \, q_3(\theta).

step for this approximation is to choose a criterion. A natural where the shaping parameters of these laws are mutually de-
criterion is the Kullback–Leibler divergence pendent. So, an iterative method should be applied to obtain the
optimal values. In the following, we will give the expression of
each shaping parameter for the iteration as function of the pre-
vious iteration .
We can unify both priors by means of contour variable
which is set to 1 in the MIG prior and the value defined in (16)
(23) in the MGM case.
We start by the conditional posterior of the image
where is the expectation of w.r.t . So, the main mathe- in (26a) where and are given by the following
matical problem is finding which minimizes . relations:
We first make two points:
1) the optimal solution without any constraint is the trivial
solution ;
2) the optimal solution with the constraint
where is a given constant is the one which maximizes
the entropy and is given by using the properties of the (27a)
exponential family. This functional optimization problem
can be solved and the general solution is
MIG
(24) (27b)
MGM
where and are the normalizing fac- MIG case
MGM case (27c)
tors [16], [45].
However, we may note that, first the expression of de-
pends on the expressions of , . Thus, this compu- (27d)
tation can be done only in an iterative way. The second point
is that in order to compute these solutions we must compute (27e)
. The only families for which these computa-
tions are easily done are the conjugate exponential families. At (27f)
this point, we see the importance of our choice of priors in the
(27g)
previous section.
The first step is to choose a separable form that is appro-
priate for our problem. In fact there is no rule for choosing the (27h)
appropriate separation; nevertheless, this choice must conserve
the strong dependences between variables and break the weak The expression for , , , and are given later in (29).
ones, keeping in mind the computation complexity of the poste- The expression for , and of the posterior law of the
rior law. In this work, we propose a strongly separated posterior, hidden field in (26c) are given by the following relations:
where only dependence between image pixels and hidden fields
is conserved. This posterior is given by

(25)

Applying the approximated posterior expression (24) on


(28a)
, we see that the optimal solution for
has the following form:

(26a)

(26b)

(26c)
(26d) (28b)
(26e)
(26f)
(28c)
(26g)

TABLE I
HYPER-HYPERPARAMETER VALUES

TABLE II
MODEL-GENERATED IMAGE PROPERTIES

Fig. 3. Restoration results for the MIG model: (a) original image, (b) distorted image, (c) original segmentation, (d) MIG segmentation, (e) VB MIG reconstruction, and (f) VB MGM reconstruction.

Fig. 4. Restoration results for the MGM model: (a) original image, (b) distorted image, (c) original segmentation, (d) MGM segmentation, (e) VB MIG restoration, and (f) VB MGM restoration.

Finally, the shaping parameters of the hyperparameter posteriors (26d)–(26g) are given by the following relations:

    (29a)
    (29b)
    (29c)
    (29d)
    (29e)
    (29f)
    (29g)

Several observations can be made about these expressions. The most important is that the problem of probability law optimization has turned into a simple parametric computation, which significantly reduces the computational burden. Indeed, even with our choice of a strong separation, a dependence between the posterior mean values of the image pixels and of the hidden field elements remains present in the equations, which justifies the use of a spatially dependent prior model with this separable approximated posterior. On the other hand, the iterative nature of the solution requires the choice of a stopping criterion. We have chosen to use the variation of the negative free energy

    F(q) = \left\langle \ln \frac{p(f, z, \theta, g | \mathcal{M})}{q(f, z, \theta)} \right\rangle_q    (30)

to decide the convergence of the variables. This seems natural, since it can be expressed as the difference between the log-evidence of the model and the Kullback–Leibler divergence

    F(q) = \ln p(g | \mathcal{M}) - KL(q \| p)    (31)

We can find the expression of the free energy from the previously calculated shaping parameters at almost no extra cost. Furthermore, its value can be used as a criterion for model selection. We will present in the next section some restoration results obtained using our method.
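Algorithmically, (24)-(26) amount to the coordinate-ascent loop sketched below. The four routines passed in are placeholders for the shaping-parameter relations (27)-(29) and the free-energy evaluation (30); only the loop structure is implied by the paper.

```python
# Coordinate-ascent structure implied by (24)-(26): each factor of the
# separable posterior is refreshed from the current moments of the others,
# and iteration stops when the negative free energy (30) stabilizes.
def variational_bayes(g, q0, update_image, update_labels, update_hyper,
                      neg_free_energy, tol=1e-6, max_iter=200):
    q, F_prev = q0, -float("inf")
    for t in range(max_iter):
        q = update_image(g, q)     # q(f(r) | z(r)): relations (27a)-(27h)
        q = update_labels(g, q)    # q(z(r)): relations (28a)-(28c)
        q = update_hyper(g, q)     # q(m), q(v), q(sigma_eps^2), q(alpha): (29)
        F = neg_free_energy(g, q)  # negative free energy, eq. (30)
        if abs(F - F_prev) < tol:  # stop on the variation of F
            return q, F
        F_prev = F
    return q, F_prev
```

The converged value of F also serves as the model-selection score discussed above, since it approximates the log-evidence by (31).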
V. NUMERICAL EXPERIMENT RESULTS AND DISCUSSION

In this section, we show several restoration results obtained using our method. We start by defining the values of the hyper-hyperparameters that were used during the different experiments (Table I). Then, we apply the proposed methods to a synthesized restoration problem with images generated from our model. Afterwards, the method is applied to real images. Finally, we compare the performance of our method to some other methods, and especially to a restoration method based upon the same prior model but with an MCMC estimator.

We choose the values of the hyper-hyperparameters in such a way that our priors stay as noninformative as possible. However, for the Potts parameter, we fixed the value that worked best for us.

A. Model-Generated Image Restoration

We start the test of our method by applying it to two simple images generated from our prior models (MIG and MGM). Then, a box-triangle convolution kernel is applied and white Gaussian noise is added. The chosen values are given in Table II.

Fig. 5. Comparison between the histograms of the test images: (a) MIG image, (b) VB MIG restoration of the MIG image, (c) VB MIG restoration of the MGM image, (d) MGM image, (e) VB MGM restoration of the MIG image, and (f) VB MGM restoration of the MGM image.

TABLE III
MIG IMAGE RESULTS SUMMARY

TABLE IV
MGM IMAGE RESULTS SUMMARY

TABLE V
TEXT RESTORATION EXPERIMENT CONDITIONS

Fig. 6. Text restoration results: (a) original image, (b) distorted image, (c) original segmentation, (d) initial segmentation, (e) VB MIG segmentation, and (f) VB MIG restoration.

We can see from Fig. 3 that our method was able to restore the image with a small error (detailed results are available in Table III). However, we can see that the quality of reconstruction of VB MIG is better than that of VB MGM, as expected, since in the MIG the pixels in the same class are modeled as Gaussian, while in the MGM the Gaussian property is imposed on the derivative (Markovian property). Similar results are found in the case of the MGM model-generated image (see Table IV): there, the VB MGM method has a better restoration performance than the VB MIG method, since it is more adaptive.

We can see this property more clearly by comparing the histograms of the images (a minimal sketch of such a comparison follows this paragraph). From Fig. 5, the histograms of the MIG image and of its VB MIG restoration are very similar. The same observation can be made for the MGM image and the VB MGM restored image.

B. Testing Against "Real" Images

Herein, we show that our algorithm works not only for images generated from the prior model, but also for images coming from several real applications. We start with a text restoration problem (the experiment conditions are given in Table V). We apply the VB MIG restoration method, but we show how it behaves with a higher number of classes as prior information, so we set the number of classes to three. As we can see from Fig. 6, the result contains only two classes (the background and the text). Although no direct estimation of the optimal number of classes is implemented in the method, it is able to eliminate the extra class through the segmentation process: pixels are classified into the dominating classes, while the parameters of the eliminated class are set to their prior values.

Fig. 7. Text restoration: hyperparameter evolution versus iteration: (a) error precision, (b) class means, z = 1, 2, 3, (c) precision of the class means, z = 1, 2, 3, (d) class variance parameter 1, (e) class variance parameter 2, (f) singleton energy parameter, z = 1, 2, 3, (g) negative free energy, and (h) RMS of the error.

Moreover, we are interested in the evolution of the hyperparameters over the iterations (Fig. 7). We notice that almost all the variables reach their final values in 10 iterations. However, convergence is not achieved before iteration 25; this corresponds to the elimination of the extra class: all its hyperparameters take their prior values, and the negative free energy makes a step change toward its final value. In fact, this is very interesting, since the log-evidence of the model can be approximated by the negative free energy after convergence. A higher value of the negative free energy means a better fit of the model, which is the case with two classes instead of three. Nevertheless, the estimation of the number of classes seems indispensable for other cases. Running a number of restorations with different numbers of classes and comparing the values of the negative free energy provides a first remedy (a sketch of this procedure is given at the end of this subsection).

Fig. 8. Goofy restoration: (a) original image, (b) distorted image (L1 = 12.5%), and (c) MIG restoration (L1 = 7.8%).

Fig. 9. Brain MRI restoration: (a) original image, (b) distorted image, and (c) VB MIG restoration.

Moreover, we have studied the performance of our method on images to which our prior models do not correspond exactly (Figs. 8 and 9). Fortunately, our method still gives good results, though several flaws can be remarked. For example, in the brain image, the grey matter is more constant than in the original image, because of the underestimation of the class variance. For the Goofy image, the background has an overestimated variance.

To test the limits of our prior model, we have tested our method on a "classical" image of the image processing literature (the cameraman, Fig. 10). As we can see, the method was able to restore the body of the cameraman finely, notably his eye, which had disappeared in the distorted version. However, the major drawback is the restoration of textured areas. Instead of the continuous gradient of the sky, the MIG output split it into two almost constant classes; the same thing happened for the grass. This is normal, because of the homogeneous class prior. For the MGM model, these problems were less pronounced. However, the Gaussian property, which is acceptable for the sky, is not valid for the texture of the grass.

TABLE VI
NORMALIZED L1 DISTANCE FOR DIFFERENT METHODS: MATCHED FILTER (MF), WIENER FILTER (WF), LEAST SQUARES (LS), MCMC FOR A MIG PRIOR (MCMC), AND FINALLY OUR TWO METHODS, VB MIG (MIG) AND VB MGM (MGM)

Fig. 10. Cameraman restoration: (a) original image, (b) distorted image, (c) VB MIG restoration, and (d) VB MGM restoration.

Fig. 11. Comparison between different restoration methods: (a) matched filter (MF), (b) Wiener filtering (WF), (c) least squares (LS), (d) MCMC MIG, (e) VB MIG, and (f) VB MGM.

C. Comparison Against Other Restoration Methods

We present in the following a comparison between other restoration methods and the proposed one. The test is based upon two aspects: the quality of the restored image, and the computational time compared to an MCMC-based algorithm with the same prior model. For the quality of restoration, we use the L1 distance(1) between the original image and the restored one (Table VI). The methods are: matched filter (MF), Wiener filter (WF), least squares (LS), MCMC for a MIG prior (MCMC), and finally our two methods, VB MIG (MIG) and VB MGM (MGM). We can see that our algorithm has good restoration performance in comparison with these methods (Fig. 11).

For the computation time, the method is compared to an MCMC-based estimator using the same prior. The tests were performed on an Intel Dual Core 2.66-GHz processor based machine, with both algorithms coded in Matlab; the variational Bayes method required much less computation time. Moreover, for higher dimensions, MCMC-like algorithms need more storage space for the samples, whereas variational Bayes based ones require only the storage of the shaping parameters of the posterior laws at the current and previous iterations.

(1) L1 distances are more adapted for piecewise homogeneous images, since the difference image fits a double exponential distribution better.
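A sketch of this criterion, under our reading of the normalization (the exact normalization is not spelled out in the paper):

```python
import numpy as np

# Normalized L1 distance, as used for Table VI: L1 norm of the difference
# image divided by the L1 norm of the original (our assumed normalization).
def normalized_l1(f_true, f_hat):
    return np.abs(f_true - f_hat).sum() / np.abs(f_true).sum()
```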
VI. CONCLUSION

We considered the problem of joint restoration and segmentation of images degraded by a known PSF and by Gaussian noise. To perform joint restoration and segmentation, we proposed a Gauss–Markov–Potts prior model. More precisely, two priors, the independent Gaussian and the Gauss–Markov models, were studied together with a Potts prior on the hidden field. The expression of the joint posterior law of all the unknowns (image, hidden field, hyperparameters) is complex, and it is difficult to compute the MAP or PM estimators. Therefore, we proposed a variational Bayes approximation method. This method was applied to several restoration problems, where it gave promising results.

Still, a number of aspects regarding this method remain to be studied, including the convergence conditions, the quality of the estimation of the class and error variances, the choice of the separation, and the estimation of the Potts parameter.

APPENDIX A
PROBABILITY DISTRIBUTIONS

We recall herein the definitions of the main probability distributions used in this article, to avoid any ambiguity.

A. Gaussian

Let x be a random variable with a Gaussian distribution with mean \mu and variance \sigma^2. Then its density is given by

    \mathcal{N}(x | \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{ -\frac{(x - \mu)^2}{2\sigma^2} \right\}    (32)

B. Gamma and Inverse Gamma

Let x be a random variable with a Gamma distribution with shape parameter a and scale parameter b. Then we have

    \mathcal{G}(x | a, b) = \frac{1}{\Gamma(a) \, b^a} \, x^{a-1} \exp\left\{ -\frac{x}{b} \right\}, \quad x > 0    (33)

with mean a b. In a similar way, we define the Inverse Gamma probability distribution as follows:

    \mathcal{IG}(x | a, b) = \frac{b^a}{\Gamma(a)} \, x^{-a-1} \exp\left\{ -\frac{b}{x} \right\}, \quad x > 0    (34)

with mean b / (a - 1) for a > 1.

C. Multivariate Normal Distribution

Let the vector x = [x_1, \ldots, x_n]^t follow a multivariate normal distribution with expected vector \mu and covariance matrix \Sigma. Then its probability density is given by

    \mathcal{N}(x | \mu, \Sigma) = (2\pi)^{-n/2} |\Sigma|^{-1/2} \exp\left\{ -\frac{1}{2} (x - \mu)^t \Sigma^{-1} (x - \mu) \right\}    (35)

If a random p \times p matrix X follows a Wishart distribution with n degrees of freedom and a precision matrix V, its probability density is given by

    \mathcal{W}(X | n, V) = \frac{|X|^{(n - p - 1)/2} \exp\left\{ -\frac{1}{2} \mathrm{tr}(V^{-1} X) \right\}}{2^{np/2} \, |V|^{n/2} \, \Gamma_p(n/2)}    (36)

with \Gamma_p the multivariate Gamma function

    \Gamma_p(n/2) = \pi^{p(p-1)/4} \prod_{j=1}^{p} \Gamma\left( \frac{n}{2} - \frac{j-1}{2} \right)    (37)

D. Dirichlet

Let the vector a = [a_1, \ldots, a_K]^t follow a Dirichlet distribution with shaping parameters \alpha = [\alpha_1, \ldots, \alpha_K]. Then we have

    \mathcal{D}(a | \alpha) = \frac{\Gamma\left( \sum_k \alpha_k \right)}{\prod_k \Gamma(\alpha_k)} \prod_{k=1}^{K} a_k^{\alpha_k - 1}    (38)

with a_k \geq 0 and \sum_{k=1}^{K} a_k = 1.
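All of the densities (32)-(38) are standard and available in scipy.stats, which is convenient for unit-checking implementations of the conjugate updates; the parameter values below are arbitrary.

```python
import numpy as np
from scipy import stats

# Evaluate the standard densities recalled in Appendix A at arbitrary points.
print(stats.norm.pdf(0.7, loc=0.5, scale=0.2))             # Gaussian (32)
print(stats.gamma.pdf(0.7, a=2.0, scale=1.0))              # Gamma (33)
print(stats.invgamma.pdf(0.7, a=2.0, scale=1.0))           # Inverse Gamma (34)
print(stats.multivariate_normal.pdf([0.1, -0.2], mean=[0, 0],
                                    cov=np.eye(2)))        # multivariate (35)
print(stats.wishart.pdf(np.eye(2), df=3, scale=np.eye(2))) # Wishart (36)
print(stats.dirichlet.pdf([0.2, 0.3, 0.5], [1, 1, 1]))     # Dirichlet (38)
```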

APPENDIX B
DERIVATION OF THE VARIATIONAL BAYES POSTERIORS

We present herein the derivation of the variational Bayes posteriors of our problem. For the sake of simplicity, we omit the iteration number (t).

A. Image Conditional Posterior

From (24) we can write

    (39a)
    (39b)
    (39c)

and

    (40a)
    (40b)

with

    (40c)
    (40d)

and

    (40e)

By adding the missing term in (40d), we get the gradient-like expression of the posterior mean.

B. Hidden Field Posterior

    (41a)
    (41b)

    (42a)

By completing the square in the first term of (42a) and calculating the expectation with respect to all the posterior laws except q(z), we obtain the posterior of the hidden field, with the other normalizing terms

    (43a)

By arranging the previous terms, we get the three terms composing the expressions given in (28), with

    (44a)

C. Model Error Posterior

    (45a)
    (45b)
    (45c)

Summing the last two terms leads us to the expressions given in (29a) and (29b), with

    (46)

D. Class Means Posterior

    (47a)
    (47b)
    (47c)

Gathering these terms leads to the Gaussian expression given in (29c) and (29d).

E. Class Variance Posteriors

    (48a)
    (48b)
    (48c)

With a technique similar to the one used for the model error posterior, we get the posterior, with

    (49)

F. Singleton Energy Posteriors

    (50a)
    (50b)
    (50c)

Consequently, we obtain the same expression for the singleton energy posterior as given in (29g).

ACKNOWLEDGMENT

The authors would like to thank S. Fekih-Salem for her careful reading and revision of the paper.

REFERENCES

[1] A. N. Tikhonov, "Solution of incorrectly formulated problems and the regularization method," Sov. Math., pp. 1035–1038, 1963.
[2] C. Bouman and K. Sauer, "A generalized Gaussian image model for edge-preserving MAP estimation," IEEE Trans. Image Process., vol. 2, no. 3, pp. 296–310, Jul. 1993.
[3] P. J. Green, "Bayesian reconstructions from emission tomography data using a modified EM algorithm," IEEE Trans. Med. Imag., vol. 9, no. 1, pp. 84–93, Mar. 1990.
[4] S. Geman and D. E. McClure, "Bayesian image analysis: Application to single photon emission computed tomography," Amer. Statist. Assoc., pp. 12–18, 1985.
[5] G. Demoment, "Image reconstruction and restoration: Overview of common estimation structures and problems," IEEE Trans. Acoust., Speech, Signal Process., vol. 37, no. 12, pp. 2024–2036, Dec. 1989.
[6] A. K. Katsaggelos, "Digital image restoration," in Proc. Statist. Comput. Sect., Berlin, Germany, 1991, vol. 6, Lecture Notes in Mathematics 1832, pp. 12–18.
[7] A. K. Jain, Fundamentals of Digital Image Processing. Upper Saddle River, NJ: Prentice-Hall, 1989.
[8] S. Geman and D. Geman, Stochastic Relaxation, Gibbs Distributions and the Bayesian Restoration of Images. San Mateo, CA: Morgan Kaufmann, 1987.
[9] F. C. Jeng, J. W. Woods, and B. Morristown, "Compound Gauss–Markov random fields for image estimation," IEEE Trans. Signal Process., vol. 39, no. 3, pp. 683–697, Mar. 1991.
[10] M. Nikolova, J. Idier, and A. Mohammad-Djafari, "Inversion of large-support ill-posed linear operators using a piecewise Gaussian MRF," IEEE Trans. Image Process., vol. 7, no. 4, pp. 571–585, Apr. 1998.
[11] M. Nikolova, "Local strong homogeneity of a regularized estimator," SIAM J. Appl. Math., vol. 61, no. 2, pp. 633–658, 2000.
[12] M. Nikolova, "Thresholding implied by truncated quadratic regularization," IEEE Trans. Signal Process., vol. 48, no. 12, pp. 3437–3450, Dec. 2000.
[13] G. Wang, J. Zhang, and G. W. Pan, "Solution of inverse problems in image processing by wavelet expansion," IEEE Trans. Image Process., vol. 4, no. 5, pp. 579–593, May 1995.
[14] J. M. Bioucas-Dias, "Bayesian wavelet-based image deconvolution: A GEM algorithm exploiting a class of heavy-tailed priors," IEEE Trans. Image Process., vol. 15, no. 4, pp. 937–951, Apr. 2006.
[15] G. E. Hinton and D. van Camp, "Keeping the neural networks simple by minimizing the description length of the weights," in Proc. 6th Annu. Conf. Computational Learning Theory, New York, NY: ACM, 1993, pp. 5–13.
[16] D. J. C. MacKay, "Ensemble learning and evidence maximization," in Proc. NIPS, 1995.
[17] M. I. Jordan, Z. Ghahramani, T. S. Jaakkola, and L. K. Saul, "An introduction to variational methods for graphical models," Mach. Learn., vol. 37, no. 2, pp. 183–233, 1999.
[18] T. S. Jaakkola and M. I. Jordan, "Bayesian parameter estimation via variational methods," Statist. Comput., vol. 10, no. 1, pp. 25–37, 2000.
[19] M. Opper and D. Saad, Eds., Advanced Mean Field Methods: Theory and Practice. Cambridge, MA: MIT Press, 2001.
[20] H. Attias, "Independent factor analysis," Neural Computat., vol. 11, no. 4, pp. 803–851, 1999.
[21] A. C. Likas and N. P. Galatsanos, "A variational approach for Bayesian blind image deconvolution," IEEE Trans. Signal Process., vol. 52, no. 8, pp. 2222–2233, Aug. 2004.
[22] R. Molina, J. Mateos, and A. K. Katsaggelos, "Blind deconvolution using a variational approach to parameter, image, and blur estimation," IEEE Trans. Image Process., vol. 15, no. 12, pp. 3715–3727, Dec. 2006.
[23] S. D. Babacan, R. Molina, and A. K. Katsaggelos, "Parameter estimation in TV image restoration using variational distribution approximation," IEEE Trans. Image Process., vol. 17, no. 3, p. 326, Mar. 2008.
[24] G. Chantas, N. Galatsanos, A. Likas, and M. Saunders, "Variational Bayesian image restoration based on a product of t-distributions image prior," IEEE Trans. Image Process., vol. 17, no. 10, pp. 1795–1805, Oct. 2008.
[25] S. D. Babacan, R. Molina, and A. K. Katsaggelos, "Variational Bayesian blind deconvolution using a total variation prior," IEEE Trans. Image Process., vol. 18, no. 1, pp. 12–26, Jan. 2009.
[26] D. G. Tzikas, A. C. Likas, and N. P. Galatsanos, "Variational Bayesian sparse kernel-based blind image deconvolution with Student's-t priors," IEEE Trans. Image Process., vol. 18, no. 4, pp. 753–764, Apr. 2009.
[27] H. Snoussi and A. Mohammad-Djafari, "Fast joint separation and segmentation of mixed images," J. Electron. Imag., vol. 13, p. 349, 2004.
[28] O. Feron, B. Duchene, and A. Mohammad-Djafari, "Microwave imaging of inhomogeneous objects made of a finite number of dielectric and conductive materials from experimental data," Inv. Prob., vol. 21, no. 6, p. 95, 2005.
[29] F. Humblot, B. Collin, and A. Mohammad-Djafari, "Evaluation and practical issues of subpixel image registration using phase correlation methods," in Proc. Physics in Signal and Image Processing Conf., 2005, pp. 115–120.
[30] A. Mohammad-Djafari, "2D and 3D super-resolution: A Bayesian approach," in Proc. AIP Conf., 2007, vol. 949, p. 18.
[31] W. R. Gilks, S. Richardson, and D. J. Spiegelhalter, Markov Chain Monte Carlo in Practice. London, U.K.: Chapman & Hall, 1996.
[32] A. Mohammad-Djafari, "Super-resolution: A short review, a new method based on hidden Markov modeling of HR image and future challenges," Comput. J., 2008, doi: 10.1093/comjnl/bxn005, pp. 126–141.
[33] Z. Ghahramani and M. Jordan, "Factorial hidden Markov models," Mach. Learn., no. 29, pp. 245–273, 1997.
[34] W. Penny and S. Roberts, "Bayesian neural networks for classification: How useful is the evidence framework?," Neural Netw., vol. 12, pp. 877–892, 1998.
[35] S. Roberts, D. Husmeier, W. Penny, and I. Rezek, "Bayesian approaches to Gaussian mixture modelling," IEEE Trans. Pattern Anal. Mach. Intell., vol. 20, no. 11, pp. 1133–1142, Nov. 1998.
[36] W. Penny and S. Roberts, "Dynamic models for nonstationary signal segmentation," Comput. Biomed. Res., vol. 32, no. 6, pp. 483–502, 1999.
[37] W. Penny and S. Roberts, "Bayesian multivariate autoregressive models with structured priors," Proc. IEE Vis., Image Signal Process., vol. 149, no. 1, pp. 33–41, 2002.
[38] S. Roberts and W. Penny, "Variational Bayes for generalised autoregressive models," IEEE Trans. Signal Process., vol. 50, no. 9, pp. 2245–2257, Sep. 2002.
[39] W. Penny and K. Friston, "Mixtures of general linear models for functional neuroimaging," IEEE Trans. Med. Imag., vol. 22, no. 4, pp. 504–514, Apr. 2003.
[40] R. A. Choudrey and S. J. Roberts, "Variational mixture of Bayesian independent component analysers," Neural Computat., vol. 15, no. 1, 2003.
[41] W. Penny, S. Kiebel, and K. Friston, "Variational Bayesian inference for fMRI time series," NeuroImage, vol. 19, no. 3, pp. 727–741, 2003.
[42] N. Nasios and A. Bors, "A variational approach for Bayesian blind image deconvolution," IEEE Trans. Signal Process., vol. 52, no. 8, pp. 2222–2233, Aug. 2004.
[43] N. Nasios and A. Bors, "Variational learning for Gaussian mixture models," IEEE Trans. Syst., Man, Cybern. B, Cybern., vol. 36, no. 4, pp. 849–862, Aug. 2006.
[44] K. Friston, J. Mattout, N. Trujillo-Barreto, J. Ashburner, and W. Penny, "Variational free energy and the Laplace approximation," NeuroImage, 2006, doi: 10.1016/j.neuroimage.2006.08.035, available online.
[45] R. A. Choudrey, "Variational Methods for Bayesian Independent Component Analysis," Ph.D. dissertation, Univ. Oxford, Oxford, U.K., 2002.


Hacheme Ayasso (S'08) was born in Syria in 1980. He received the engineer's degree in electronic systems from the Higher Institute of Applied Science and Technology (ISSAT), Damascus, Syria, in 2002, and the M.S. degree in signal and image processing from the University of Paris-Sud 11, Orsay, France, in 2007. He is currently working toward the Ph.D. degree at the University of Paris-Sud 11, in the inverse problems group (GPI) and the Department of Electromagnetics (DRE) of the Laboratoire des Signaux et Systèmes (L2S), Gif-sur-Yvette, France.
He was a research assistant in the electronic measurements group at ISSAT from 2003 to 2006, where he worked on nondestructive testing techniques. His research interests include the application of Bayesian inference techniques to inverse problems, and X-ray and microwave tomographic reconstruction.

Ali Mohammad-Djafari (M'02) was born in Iran. He received the B.Sc. degree in electrical engineering from the Polytechnique of Teheran in 1975, the diploma degree (M.Sc.) from the Ecole Supérieure d'Electricité (SUPELEC), Gif-sur-Yvette, France, in 1977, and the "Docteur-Ingénieur" (Ph.D.) degree and the "Doctorat d'Etat" in physics from the Université Paris-Sud 11 (UPS), Orsay, France, in 1981 and 1987, respectively.
He was an Associate Professor at UPS for two years (1981–1983). Since 1984, he has held a permanent position at the Centre National de la Recherche Scientifique (CNRS), working at the Laboratoire des Signaux et Systèmes (L2S) at SUPELEC. From 1998 to 2002, he was at the head of the Signal and Image Processing division of this laboratory. In 1997–1998, he was a visiting Associate Professor at the University of Notre Dame, South Bend, IN. Presently, he is "Directeur de Recherche," and his main scientific interests are in developing new probabilistic methods based on Bayesian inference, information theory, and maximum entropy approaches for inverse problems in general, and more specifically for signal and image reconstruction and restoration. His recent research projects include blind source separation (BSS) for multivariate signals (satellite images, hyperspectral images), data and image fusion, superresolution, X-ray computed tomography, microwave imaging, and spatio-temporal positron emission tomography (PET) data and image processing. The main application domains of his interests are computed tomography (X-ray, PET, SPECT, MRI, microwave, ultrasound, and eddy current imaging), either for medical imaging or for nondestructive testing (NDT) in industry.
