A Comparative Study of Relevant Vector Machine and Support Vector Machine in Uncertainty Analysis
Abstract—Relevant Vector Machine (RVM) and Support Vector Machine (SVM) are two relatively new methods that enable us to utilize a few experimental sample points to construct an explicit metamodel. They have been extensively employed in both classification and regression problems. However, their performance in uncertainty analysis is rarely studied. The focus of this paper is to compare the two metamodeling techniques in terms of uncertainty analysis.

Keywords-relevant vector machine; support vector machine; uncertainty analysis; reliability analysis; comparative study

I. INTRODUCTION

Methods of uncertainty analysis that quantify uncertainty (mean, variance, or reliability) in system output performance based on random or noisy inputs are essential to design under uncertainty. Numerous methods for uncertainty analysis have been developed recently with the aim of improving the accuracy of uncertainty analysis and reducing the associated computational burden, such as full factorial numerical integration [1], univariate dimension reduction [2], sparse grid [3], stochastic response surface [4], and weighted stochastic response surface [5]. With the popularity of simulation tools, such as finite element analysis and computational fluid dynamics, it is generally very computationally expensive to conduct uncertainty analysis by directly using the simulation model. A metamodel-based uncertainty analysis approach, which approximates the expensive true simulation model with a cheap surrogate model, is one way to overcome this issue [6]. The time-consuming model is replaced by a metamodel on which Monte Carlo simulation (MCS) can be executed directly to obtain the mean, variance, reliability, and higher-order statistical moments.
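For illustration, a minimal Python sketch of this metamodel-based workflow is given below. It is written under stated assumptions rather than the paper's own setup: the stand-in simulation model expensive_model, the surrogate type, the sample size, and the input perturbation distribution are placeholders chosen for demonstration only.

import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def expensive_model(x):
    # Hypothetical stand-in for a costly simulation run (e.g., a finite element analysis).
    return x[:, 0] ** 2 + np.sin(3.0 * x[:, 1])

# 1) A handful of expensive training runs.
X_train = rng.uniform(-1.0, 1.0, size=(30, 2))
y_train = expensive_model(X_train)

# 2) Fit a cheap surrogate (any regressor could play this role).
surrogate = SVR(kernel="rbf").fit(X_train, y_train)

# 3) Monte Carlo simulation on the surrogate: propagate random inputs through the metamodel.
X_mc = rng.normal(loc=0.0, scale=0.1, size=(100_000, 2))
y_mc = surrogate.predict(X_mc)

print("mean     :", y_mc.mean())
print("std      :", y_mc.std(ddof=1))
print("skewness :", skew(y_mc))
print("kurtosis :", kurtosis(y_mc))  # excess kurtosis by default in SciPy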
Support Vector Machine (SVM) and Relevant Vector Machine (RVM) are two relatively new metamodeling techniques. Both of them can be used to construct surrogate models for regression and classification. The fundamentals of the two methods are slightly different: SVM obtains its result through quadratic optimization, whereas RVM is constructed by exploiting a probabilistic Bayesian learning framework [8].

Although both SVM and RVM have been widely studied and exhibit good performance in classification and regression under the deterministic context [8],[11], their application to uncertainty analysis has rarely been compared in the literature. It is necessary to examine their performance in uncertainty analysis, and this paper serves this purpose. Rather than comparing SVM and RVM in terms of capturing the global tendency, we conduct a comparative study to examine whether the outputs of both methods, learned from a few training sample points, can accurately capture performance variations due to small perturbations of the input parameters. The first four statistical moments, i.e., mean, standard deviation, skewness, and kurtosis, are used as the uncertainty quantities of interest in our comparative study. A two-bar structure problem is used to examine the performance of the two metamodeling techniques. In addition, comparisons are also conducted with respect to different sample sizes.

The rest of this paper is organized as follows: the basic principles of SVM and RVM are briefly reviewed in Section II; the error measures used for both deterministic and uncertainty analysis are introduced in Section III; a two-bar structure problem is presented in Section IV to conduct the comparative study; and Section V gives brief concluding remarks.

II. SUPPORT VECTOR MACHINE AND RELEVANT VECTOR MACHINE

A. Support Vector Machine

SVM is a machine learning method for both classification and regression. The prominent feature of SVM is that it is capable of capturing complicated decision functions or response functions, even with very limited sample points.
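As a brief illustration of this capability, the following sketch fits an epsilon-insensitive support vector regressor with a Gaussian (RBF) kernel using scikit-learn. The test function, noise level, and the values of C and epsilon are assumptions made here for demonstration, not the settings used later in the paper.

import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)

# A small set of training points from a nonlinear response function.
X = rng.uniform(0.0, 2.0 * np.pi, size=(15, 1))
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(15)

# Epsilon-insensitive support vector regression with a Gaussian kernel;
# the fit is obtained by solving a (dual) quadratic optimization problem.
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)

X_test = np.linspace(0.0, 2.0 * np.pi, 5).reshape(-1, 1)
print(model.predict(X_test))
print("number of support vectors:", len(model.support_))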
B. Relevant Vector Machine

For classification, the posterior over the weights cannot be obtained analytically, so an approximation based on Laplace's method has to be used [8].
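For the regression setting, the sparse Bayesian learning that underlies RVM can be sketched in a few lines of NumPy. The update rules below follow the type-II maximum-likelihood re-estimation described by Tipping [8]; the kernel width, iteration count, initialization, and pruning threshold are illustrative choices made here, not values taken from the paper.

import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def rvm_regression(X, t, sigma=1.0, n_iter=200, prune=1e6):
    # Design matrix: a bias column plus one Gaussian kernel column per training point.
    N = X.shape[0]
    Phi = np.hstack([np.ones((N, 1)), rbf_kernel(X, X, sigma)])
    alpha = np.full(Phi.shape[1], 1e-3)      # precisions of the zero-mean Gaussian weight priors
    beta = 1.0 / np.var(t)                   # noise precision (1 / estimated noise variance)
    active = np.arange(Phi.shape[1])         # 0 is the bias; k > 0 refers to training point k - 1
    for _ in range(n_iter):
        Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))   # posterior covariance of the weights
        mu = beta * Sigma @ Phi.T @ t                                # posterior mean of the weights
        gamma = 1.0 - alpha * np.diag(Sigma)      # how well each weight is determined by the data
        alpha = gamma / (mu ** 2 + 1e-12)         # type-II maximum-likelihood re-estimation
        beta = (N - gamma.sum()) / (np.sum((t - Phi @ mu) ** 2) + 1e-12)
        keep = alpha < prune                      # drop basis functions whose weights are pinned at zero
        alpha, Phi, active = alpha[keep], Phi[:, keep], active[keep]
    Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))
    mu = beta * Sigma @ Phi.T @ t
    return mu, Sigma, beta, active

The surviving columns indexed by active correspond to the relevance vectors, and beta gives the estimated noise precision, which is referred to again in the results discussion below.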
III. METRICS FOR PERFORMANCE MEASURES

In order to quantitatively compare the performance of SVM and RVM in uncertainty analysis, metrics should be defined first.

A. Metric for Regression

In this paper, the accuracy of SVR and RVM in terms of regression is compared by using the relative mean square error (RMSE), which indicates how accurate the metamodel is over the entire region of interest. The formulation of RMSE is [12]:

\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(\frac{\hat{y}_i - y_i}{y_i}\right)^{2}}    (13)

where y_i and \hat{y}_i are the true and metamodel-predicted responses at the i-th test point, and n is the number of test points.
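A small helper corresponding to this error measure is sketched below; it assumes the per-point relative normalization shown in Eq. (13) and is intended to be evaluated on validation points held out from the training set.

import numpy as np

def relative_rmse(y_true, y_pred):
    # Relative mean square error over a set of validation points, cf. Eq. (13).
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.sqrt(np.mean(((y_pred - y_true) / y_true) ** 2))

# e.g. relative_rmse(y_valid, surrogate.predict(X_valid))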
[mm] and the height of the two-bar structure H = [mm]. Given the other constant parameters [7]:

Normal stress limit: 400 N/mm²,
External force: F = 150 kN,
Elastic modulus: E = 210 kN/mm²,
Width of the structure: B = 750 mm,
Thickness of the cross section: T = 2.5 mm.

The buckling stress constraint, defined as in [7], is the decision function that is going to be approximated by RVM and SVM.

The approximated decision function is obtained by using the training sample points generated from the IHS method. In this example, it is assumed that the noise follows a standard normal distribution. In both RVM and SVM, the optimal value of σ in the Gaussian kernel is set to 13, and ε in SVR is set to 0.001. The two design variables are continuous and lie in [25, 45] and [600, 610], respectively. Different sizes of training sample points are used to construct RVM and SVM, and the results are tabulated in Table I.
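The comparison protocol can be sketched in Python as follows, under several assumptions: Latin hypercube sampling from SciPy stands in for the IHS sampler, decision_function is a placeholder because the two-bar constraint expression from [7] is not reproduced here, the input perturbation distribution and the SVR penalty C are illustrative choices, and the kernel width σ = 13 (mapped to scikit-learn's gamma = 1/(2σ²)), ε = 0.001, and the unit-variance noise follow the settings stated above. The RVM sketch from Section II can be fitted on the same training data for the RVM side of the comparison.

import numpy as np
from scipy.stats import qmc, skew, kurtosis
from sklearn.svm import SVR

rng = np.random.default_rng(2)
SIGMA_KERNEL = 13.0                      # Gaussian kernel width used in the text
GAMMA = 1.0 / (2.0 * SIGMA_KERNEL ** 2)  # scikit-learn's RBF parameterization

def decision_function(X):
    # Placeholder for the two-bar buckling constraint g(d, H); the true expression
    # from [7] is not reproduced here.
    return 1e-3 * X[:, 0] ** 2 + np.sin(X[:, 1] / 50.0)

def four_moments(y):
    return np.array([y.mean(), y.std(ddof=1), skew(y), kurtosis(y)])

lower, upper = np.array([25.0, 600.0]), np.array([45.0, 610.0])

# Reference moments from MCS on the true function under small input perturbations.
X_mc = rng.normal(loc=(lower + upper) / 2.0, scale=[1.0, 1.0], size=(50_000, 2))
ref = four_moments(decision_function(X_mc))

n_train = 30                               # repeated for sample sizes 30, 40, 50, 60
sampler = qmc.LatinHypercube(d=2, seed=3)  # LHS as a stand-in for the IHS sampler
X_train = qmc.scale(sampler.random(n_train), lower, upper)
y_train = decision_function(X_train) + rng.standard_normal(n_train)  # unit-variance noise

svr = SVR(kernel="rbf", gamma=GAMMA, epsilon=0.001, C=100.0).fit(X_train, y_train)
svr_moments = four_moments(svr.predict(X_mc))

print("reference moments:", ref)
print("SVR-based moments:", svr_moments)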
Figure 2. Percentage of the 50 trials in which RVM outperforms SVM (i.e., the RMSE of the first four moments of RVM is smaller than that of SVM), plotted against the size of the sample set (30 to 60 points); curves correspond to the RMSE of the response and to the mean, standard deviation (Std), skewness, and kurtosis.

From Table I, it is obvious that a larger number of sample points definitely increases the accuracy of RVM and SVM for
both deterministic and uncertainty problems. The RMSE of both the estimated response and the estimated moments decreases as the number of training samples increases. In Fig. 2, it is clearly observed that even though the global performance of RVM is much better than that of SVM in the deterministic context, the performance of SVM for uncertainty analysis has a greater chance of being superior to RVM as the sample size increases, especially for the estimation of the higher-order moments (skewness and kurtosis).

In this example, the RMSE of both RVM and SVM for uncertainty analysis keeps declining as the sample size increases. Evidently, RVM outperforms SVM in terms of capturing the first four moments of the response when the sample size is small. However, as the sample size increases, SVM becomes superior to RVM, especially for the higher-order moments. From the computational efficiency point of view, it is also found that the number of support vectors is nearly the same as the number of training samples, which means that in the regression problem the computational cost of SVM is much greater than that of RVM. Additionally, RVM provides an estimated noise level that helps us understand the potential uncertainty of the estimated decision function. This estimate becomes closer to the real noise level as the training set grows; however, it eventually becomes stable after the number of training samples reaches a certain value.
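As a concrete illustration of this noise estimate, the short sketch below reuses the rvm_regression helper from the sketch in Section II (an assumption of this write-up, not code from the paper, so that function must be in scope) and reports the estimated noise standard deviation 1/sqrt(beta) for growing training sets; the response function and sample sizes are placeholders.

import numpy as np

rng = np.random.default_rng(4)
true_noise_std = 1.0

def g(X):
    # Placeholder response; the paper's two-bar decision function is not reproduced here.
    return 1e-3 * X[:, 0] ** 2 + np.sin(X[:, 1] / 50.0)

for n in (30, 40, 50, 60):
    X = rng.uniform([25.0, 600.0], [45.0, 610.0], size=(n, 2))
    t = g(X) + true_noise_std * rng.standard_normal(n)
    mu, Sigma, beta, active = rvm_regression(X, t, sigma=13.0)
    print(n, "estimated noise std:", 1.0 / np.sqrt(beta))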
V. CONCLUSION

In this paper, RVM and SVM are compared in terms of the accuracy of the metamodel and of the uncertainty analysis. Regarding the accuracy of SVM and RVM, it is found that RVM is relatively better than SVM in the overall performance approximation in the deterministic context. For uncertainty analysis, the size of the training set has a large influence on the accuracy: the more training sample points there are, the better the accuracy of both metamodels will be. Although RVM is more accurate in estimating the first four statistical moments with a small sample size, as the sample size increases the accuracy of SVM in predicting the first four moments improves markedly and outperforms that of RVM.

ACKNOWLEDGMENT

The authors gratefully acknowledge grant support from the National Natural Science Foundation of China under contract number 51105034, the Specialized Research Fund for the Doctoral Program of Higher Education of China under contract number 20110185120014, and the Fundamental Research Funds for the Central Universities under contract number ZYGX2011J084.

REFERENCES

[1] H. S. Seo and B. M. Kwak, “Efficient statistical tolerance analysis for general distributions using three-point information,” International Journal of Production Research, vol. 40, pp. 931-944, 2002.
[2] S. Rahman and D. Wei, “A univariate approximation at most probable point for higher-order reliability analysis,” International Journal of Solids and Structures, vol. 43, pp. 2820-2839, 2006.
[3] F. F. Xiong, S. Greene, Y. Xiong, W. Chen, and S. X. Yang, “A new sparse grid based method for uncertainty propagation,” Structural and Multidisciplinary Optimization, vol. 41, no. 3, pp. 335-349, 2010.
[4] F. F. Xiong, W. Chen, Y. Xiong, and S. X. Yang, “Weighted stochastic response surface method considering sample weights,” Structural and Multidisciplinary Optimization, vol. 43, no. 6, pp. 837-849, 2011.
[5] F. F. Xiong, Y. Liu, and S. X. Yang, “A double weighted stochastic response surface method for reliability analysis,” Journal of Mechanical Science and Technology, vol. 26, no. 8, pp. 2573-2580, 2012.
[6] F. Jurecka, Robust Design Optimization Based on Metamodeling Techniques, PhD dissertation, Lehrstuhl für Statik der Technischen Universität München, 2007.
[7] R. Jin, X. Du, and W. Chen, “The use of metamodeling techniques for optimization under uncertainty,” Structural and Multidisciplinary Optimization, vol. 25, no. 2, pp. 99-116, 2003.
[8] M. E. Tipping, “Sparse Bayesian learning and the relevance vector machine,” Journal of Machine Learning Research, vol. 1, pp. 211-244, 2001.
[9] A. Basudhar and S. Missoum, “Adaptive explicit decision functions for probabilistic design and optimization using support vector machines,” Computers and Structures, vol. 86, pp. 1904-1917, 2008.
[10] A. Basudhar, S. Missoum, and A. H. Sanchez, “Limit state function identification using Support Vector Machines for discontinuous responses and disjoint failure domains,” Probabilistic Engineering Mechanics, vol. 23, pp. 1-11, 2008.
[11] S. M. Clarke, J. H. Griebsch, and T. W. Simpson, “Analysis of support vector regression for approximation of complex engineering analyses,” Journal of Mechanical Design, Transactions of the ASME, vol. 127, pp. 1077-1087, 2005.
[12] P. Zhu, F. Pan, W. Chen, and S. L. Zhang, “Use of support vector regression in structural optimization: application to vehicle crashworthiness design,” Mathematics and Computers in Simulation, vol. 86, pp. 21-31, 2012.