Camera calibration method based on circular array calibration board
Haifeng Chen, Jinlei Zhuang, Bingyou Liu, Lichao Wang & Luxian Zhang
To cite this article: Haifeng Chen, Jinlei Zhuang, Bingyou Liu, Lichao Wang & Luxian Zhang (2023) Camera calibration method based on circular array calibration board, Systems Science & Control Engineering, 11:1, 2233562, DOI: 10.1080/21642583.2023.2233562
REVIEW ARTICLE
1. Introduction
With the continuous expansion of computer vision application fields, the application scenarios of 3D vision measurement are also expanding. Cameras are the most important sensors in machine vision and have a wide range of applications in artificial intelligence (Graves et al., 2009), vision measurement (Huang et al., 2017; Kanakam, 2017; Liu et al., 2017), and robotics (Kahn et al., 1990; Yang et al., 2007). As a key technology of visual measurement, camera calibration plays a central role in machine vision ranging, pose estimation, and three-dimensional (3D) reconstruction (Liu, 2001). The calibration process establishes the transformation relationship from the 2D image coordinate system to the 3D world coordinate system (Sang, 2021). The accuracy of the calibration parameters directly affects the accuracy of vision applications (Chen, 2020; Huang et al., 2020; Li et al., 2020). Scholars at home and abroad have therefore carried out extensive research on camera calibration technology and have proposed many calibration algorithms (Qiu et al., 2000; Zhang et al., 2019).

According to the number of vision sensors, existing camera calibration methods can be divided into monocular, binocular, and multi-camera calibration. According to the calibration strategy, they can usually be divided into three types: calibration based on a calibration template (Tsai, 1986), calibration based on active vision (Maybank & Faugeras, 1992), and camera self-calibration (Zhang & Tang, 2016). A template-based method uses a calibration object of known structure and high precision as a spatial reference, establishes constraint relationships on the camera model parameters through correspondences between spatial points, and solves for these parameters with an optimization algorithm. Typical representatives include the direct linear transformation (DLT) (Abdel-Aziz, 2015) and Tsai's two-step method (Tsai, 1986). Template-based calibration can achieve relatively high accuracy, but machining and maintaining the calibration object is complicated, and placing a calibration object in a harsh and dangerous operating environment is difficult. Calibration based on active vision obtains multiple images by making the camera perform special motions on a precisely controllable platform, and determines the camera parameters from the collected images together with the known motion parameters. The representative method of this class is the linear method based on two sets of three orthogonal motions proposed by Ma Songde (Sang, 1996).
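To illustrate how a template-based method turns known point correspondences into linear constraints on the camera model, the following sketch estimates a planar homography by DLT with numpy; the point data and the homography used here are synthetic, invented purely for the example, and this is not the implementation of any of the cited methods.

```python
import numpy as np

def dlt_homography(world_pts, img_pts):
    """Estimate the 3x3 homography mapping planar world points to image
    points via the direct linear transformation (DLT) and SVD."""
    A = []
    for (X, Y), (u, v) in zip(world_pts, img_pts):
        # Each correspondence contributes two linear constraints on h.
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the overall scale

# Synthetic example: project planar points with a made-up homography,
# then recover it from the correspondences.
H_true = np.array([[1.2, 0.1, 30.0],
                   [0.05, 1.1, 40.0],
                   [1e-4, 2e-4, 1.0]])
world = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 3)]
img = []
for X, Y in world:
    p = H_true @ np.array([X, Y, 1.0])
    img.append((p[0] / p[2], p[1] / p[2]))
H_est = dlt_homography(world, img)
```

With noise-free correspondences the homography is recovered exactly up to scale; in a real calibration the overdetermined system is solved from many noisy points.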
Subsequently, Yang et al. proposed an improved scheme based on four groups and five groups of planar orthogonal motions, in which the camera is calibrated linearly using the epipole information in the images (Li et al., 2000; Yang et al., 1998). This kind of method is computationally simple, can generally be solved linearly, and has good robustness, but the system cost is high, and it is not applicable when the camera motion parameters are unknown or the camera motion cannot be precisely controlled. In recent years, many scholars have proposed camera self-calibration methods, which need no reference object and use only the correspondences between multiple views of the surrounding environment acquired during the camera's natural movement. Such methods are highly flexible and widely applicable, and they usually fit the camera parameters based on the absolute conic or the dual absolute quadric (Wu & Hu, 2001). However, self-calibration is nonlinear, and the accuracy and robustness of its results are not high.

Zhang's calibration method (Zhang, 1999) requires shooting a checkerboard calibration board from several angles. Because it is simple and effective, it is often used in practical calibration processes. However, when calibrating with a checkerboard, the accuracy of corner extraction is strongly affected by noise and image quality (Wu et al., 2013). Circular features, in contrast, are insensitive to segmentation thresholds, have a relatively high recognition rate, and their projections are robust to image noise (Crombrugge et al., 2021), so circular features have good application prospects in vision systems (Rudakova & Monasse, 2014). In the perspective projection transformation, when a circular-feature calibration plate is used for calibration, each imaged circle is transformed into an ellipse (referred to as the projection ellipse). The localization of the projection ellipse has become a research hotspot in machine vision (Zhang et al., 2017), and its accuracy directly affects camera calibration accuracy and object measurement accuracy. Commonly used ellipse extraction algorithms include Canny detection with least-squares ellipse fitting (Wang et al., 2016), the Hough transform (Bozomitu et al., 2016; Ito et al., 2011), the grey centre-of-gravity method (Frosio & Borghese, 2008), Hu invariant moments (Hu, 1962), and others. The Hough transform has good noise immunity and strong robustness, but it requires large storage, has high computational complexity, and poor specificity; the grey centroid method requires uniform grey levels, otherwise the error is large (Zhang et al., 2017); Canny detection with least-squares ellipse fitting is fast and accurate (Wang et al., 2016). Zhu et al. (Zhu et al., 2014) used the asymmetric projection of the circle's centre to calculate the centre coordinates of the projected ellipse, matching the theoretical values against the actual values in the image by least squares; however, the camera's internal parameters cannot be assumed known in practical applications, so the scope of application is small. Wu et al. (Wu et al., 2018) proposed an eccentricity compensation algorithm for circular-mark projection based on three concentric circles, computed from the eccentricity model of three groups of ellipse-fitted centre coordinates, but the amount of calculation is large. Lu et al. (Lu et al., 2020) proposed a high-precision camera calibration method based on computing the real image coordinates of the circle centre, obtaining the true centre through multiple iterative calculations; however, the projection process is relatively complex and requires a large amount of computation. Xie et al. (Xie & Wang, 2019) proposed a circle-centre extraction algorithm based on the geometric features of dual conics, but its computational complexity is large. Peng et al. (Peng et al., 2022) proposed a plane-transformation method that uses forward and backward perspective projection to obtain the coordinates of the landmark points, but it requires considerable manual work in selecting corner points and adjusting parameters.

Aiming at the characteristics of the circular calibration plate, this paper proposes a camera calibration method based on a circular array calibration plate. First, a sub-pixel edge detection algorithm is used to detect the edges of the preprocessed image; then, following the principle of locating the ellipse centre from the geometric constraints of the plane, a system of equations is established to solve for the position of the projection point of the circle's centre; finally, camera calibration is carried out with Zhang's plane-based calibration method using the obtained centre coordinates. The experimental results show that combining the ellipse contour extraction algorithm with Zhang's camera calibration method yields higher camera calibration accuracy.

2. Image acquisition and preprocessing

This article uses a 300,000-pixel "thousand-eyed wolf" camera from Fuhuang Junda Hi-Tech to take pictures and collect images. First, the camera is fixed on a stand and kept still; then a calibration plate with a 7 × 7 dot array is moved and rotated, and 14 pictures of the calibration plate are taken in different poses and directions, 12 of which are shown in Figure 1. The collected image is first converted into a grayscale image; the grayscale image is then denoised with edge preservation through guided filtering.
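A rough numpy sketch of this kind of preprocessing chain follows; for testability, a mean-filter unsharp mask and a gradient-magnitude threshold stand in for the guided filter and the Canny operator actually used in the paper (an OpenCV implementation would use cv2.ximgproc.guidedFilter and cv2.Canny instead), and the test image is synthetic.

```python
import numpy as np

def box_blur(img, k=3):
    """Simple k x k mean filter (a crude stand-in for guided filtering)."""
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def unsharp_mask(img, amount=1.0):
    """Sharpen by adding the high-frequency residual back onto the image."""
    return np.clip(img + amount * (img - box_blur(img)), 0, 255)

def gradient_edges(img, thresh=50.0):
    """Binary edge map from central-difference gradient magnitude
    (a crude stand-in for the Canny operator)."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy) > thresh

# Synthetic grayscale image: a bright disc (a circular marker) on a dark background.
yy, xx = np.mgrid[0:64, 0:64]
img = np.where((xx - 32) ** 2 + (yy - 32) ** 2 <= 15 ** 2, 200.0, 20.0)
sharp = unsharp_mask(box_blur(img))   # denoise, then re-sharpen the markers
edges = gradient_edges(sharp)         # closed edge around the disc
```

The resulting binary edge map is what the subsequent circularity, eccentricity, and convexity constraints would be applied to.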
Next, an image sharpening algorithm is used to highlight the circular markers in the image, and the Canny operator is used to detect the image edges. Finally, the sub-pixel edge detection algorithm is applied, and the results are shown in Figure 2: (a) grayscale image, (b) denoised image, (c) enhanced image, and (d) extraction result.

In the perspective projection transformation, when the circular-feature calibration plate is used for calibration, the collected circle is transformed into an ellipse because the plate is not parallel to the camera. In this paper, the sub-pixel edge detection algorithm is used to detect the edges of the collected image; then constraints on circularity, eccentricity, and convexity are applied, according to the characteristics of the obtained closed edges, to extract the ellipse contours that meet the requirements.

(1) Roundness

The roundness feature reflects the degree to which a figure is close to a perfect circle, and its range is (0, 1]. The circularity C can be expressed as

C = 4πS/P²   (1)

Among them, S and P represent the area and perimeter of the shape, respectively. When the circularity C is 1, the shape is a perfect circle; as C approaches 0, the shape becomes a progressively elongated polygon. Therefore, the closer the feature points to be extracted in this paper are to a circle, the closer the value of circularity C is to 1.

(2) Eccentricity

Eccentricity is the degree to which a conic deviates from an ideal circle. The eccentricity of an ideal circle is 0, so eccentricity represents how much a curve differs from a circle: the greater the eccentricity, the smaller the curvature. A conic with an eccentricity between 0 and 1 is an ellipse, and one with an eccentricity equal to 1 is a parabola. Given that directly calculating the eccentricity of a figure is complicated, the concept of image moments can be used to calculate the inertia rate I of the figure, and the eccentricity can then be obtained from the inertia rate. The relationship between the eccentricity E and the inertia rate I is

E = √(1 − I²)   (2)

(3) Convexity

V = S/H   (3)

In the formula, H represents the area of the convex hull corresponding to the figure. The closer the convexity V is to 1, the closer the figure is to a circle.
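The three contour constraints above (circularity, eccentricity, and convexity) can be sketched in numpy as follows; the helper names, the moment-based eccentricity estimate, and the sampled test contours are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def polygon_area_perimeter(pts):
    """Shoelace area S and perimeter P of a closed polygon (ordered vertices)."""
    x, y = pts[:, 0], pts[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    per = np.sum(np.hypot(np.diff(x, append=x[0]), np.diff(y, append=y[0])))
    return area, per

def convex_hull(pts):
    """Andrew's monotone-chain convex hull, returned as ordered vertices."""
    P = sorted(map(tuple, pts))
    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and ((h[-1][0] - h[-2][0]) * (p[1] - h[-2][1])
                                   - (h[-1][1] - h[-2][1]) * (p[0] - h[-2][0])) <= 0:
                h.pop()
            h.append(p)
        return h
    return np.array(half(P)[:-1] + half(P[::-1])[:-1])

def features(pts):
    """Circularity C = 4*pi*S/P^2 (eq. 1), eccentricity E of the equivalent
    ellipse from second-order central moments, convexity V = S/H (eq. 3)."""
    S, P = polygon_area_perimeter(pts)
    C = 4 * np.pi * S / P ** 2
    d = pts - pts.mean(axis=0)
    mu20, mu02 = (d[:, 0] ** 2).mean(), (d[:, 1] ** 2).mean()
    mu11 = (d[:, 0] * d[:, 1]).mean()
    common = np.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
    lam1 = (mu20 + mu02 + common) / 2      # major-axis variance
    lam2 = (mu20 + mu02 - common) / 2      # minor-axis variance
    E = np.sqrt(1 - lam2 / lam1)           # E = sqrt(1 - I^2) with I = b/a
    H_area, _ = polygon_area_perimeter(convex_hull(pts))
    return C, E, S / H_area

t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
circle = np.c_[np.cos(t), np.sin(t)]
C, E, V = features(circle)   # near-ideal circle: C ~ 1, E ~ 0, V ~ 1
```

A contour is accepted as a candidate projection ellipse only when these three values fall inside chosen thresholds; the thresholds themselves are application-specific and are not fixed by the code above.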
Figure 3. Schematic diagram of projective transformation.

Assuming that four collinear points A, B, C and D exist in the plane, their cross ratio can be written as:

straight line equations, so the coefficient vectors of the straight lines L1, L2, and L3 are

L1 = T1 × M1,  L2 = T2 × M2,  L3 = T3 × M3   (14)

From the geometric relationship, the three lines L1, L2, and L3 pass through the same point, so

L2 · [L1 × L3] = 0   (15)

The intersection of L2 and L3 is the centre projection point O′, O′ = L2 × L3. The projection of the line at infinity of the plane containing the circle onto the imaging plane is a finite straight line. Formula (5) demonstrates that its projection equation is

l∞ = EO′ = E · [L2 × L3]   (16)

Given that all points at infinity lie on this line at infinity, the point V3′ is on l∞, so

l∞ · V3′ = 0   (17)

Formulas (13), (15), and (17) are combined to obtain a nonlinear equation system in the three unknowns α, β, and γ. Solving this system yields the values of α, β, and γ; substituting them into the expressions of the straight lines L1, L2, and L3 gives the coordinates of the projection point of the circle's centre.

4. Camera calibration

Camera calibration determines the correspondence between a point in space and its position in a 2D image by calculating the camera's internal parameters and its external position parameters. The calibration method used in this paper is Zhang's plane calibration method. In the imaging geometry of the camera, the linear imaging model describes the imaging process through four coordinate systems: the world coordinate system, the camera coordinate system, the image physical coordinate system, and the image pixel coordinate system. Let the homogeneous world coordinate of a point P in space be [Xw Yw Zw 1]^T, its coordinate after the rigid-body transformation into the camera coordinate system be [Xc Yc Zc]^T, its coordinate after perspective projection into the image physical coordinate system be [x y z]^T, and its final pixel coordinate in the image be [u v]^T. The transformation relationship between the world coordinate system and the image coordinate system can be obtained through the transformations between these four coordinate systems.

  [u]   [ku  0   u0] [x]
  [v] = [0   kv  v0] [y]   (18)
  [1]   [0   0   1 ] [1]

     [x]   [fx  0   0] [Xc]
  Zc [y] = [0   fy  0] [Yc]   (19)
     [1]   [0   0   1] [Zc]

In the formula, H1 is the internal parameter matrix; fx, fy, u0, and v0 are the parameters of the internal parameter matrix, and ku and kv are the scale factors on the X-axis and Y-axis, respectively.

In summary, the conversion relationship between the world coordinate system and the pixel coordinate system can be expressed as

     [u]   [fx  0   u0  0]          [Xw]                            [Xw]
  Zc [v] = [0   fy  v0  0] [R  T]   [Yw]  = H1 H2 [Xw Yw Zw 1]^T = H [Yw]
     [1]   [0   0   1   0] [0  1]   [Zw]                            [Zw]
                                    [1 ]                            [1 ]   (20)

In the formula, Zc is the Z-axis coordinate value in the camera coordinate system, and R and T represent the rigid-body transformation. H is the homography matrix, which contains the camera internal parameter matrix H1 and the external parameter matrix H2. The internal parameter matrix is related only to the camera's own attributes and internal structure; the external parameter matrix is completely determined by the mapping relationship between the world coordinate system and the camera coordinate system.

Assume that the ideal pixel coordinate is (u, v); because the camera is distorted during the shooting process, the real pixel coordinate is (u′, v′). Nonlinear distortion is mainly divided into radial distortion, tangential distortion, and centrifugal distortion.

Table 2. Average projection error of image.

Image number | Circular mean error (pixels) | Checkerboard mean error (pixels)
 1 | 0.017408 | 0.136221
 2 | 0.013896 | 0.14586
 3 | 0.015124 | 0.124842
 4 | 0.018201 | 0.129999
 5 | 0.013081 | 0.133154
 6 | 0.013234 | 0.124160
 7 | 0.012524 | 0.124391
 8 | 0.016687 | 0.133391
 9 | 0.017558 | 0.120564
10 | 0.014313 | 0.150970
11 | 0.015641 | 0.140890
12 | 0.013821 | 0.124783
13 | 0.016153 | 0.152299
14 | 0.013468 | 0.148469
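The chain of transformations in Equations (18)–(20) can be sketched numerically as follows; the intrinsic values echo the calibrated circular-array matrix H1 reported later in the paper (rounded here), while the rotation and translation are invented purely for illustration.

```python
import numpy as np

# Intrinsics (fx, fy, u0, v0), rounded from the paper's circular-array H1.
fx, fy, u0, v0 = 4896.6, 4893.8, 774.9, 549.7
H1 = np.array([[fx, 0, u0, 0],
               [0, fy, v0, 0],
               [0,  0,  1, 0]], dtype=float)   # internal parameter matrix, eq. (20)

# Hypothetical extrinsics: a rotation about Z by 10 degrees plus a translation.
a = np.deg2rad(10)
R = np.array([[np.cos(a), -np.sin(a), 0],
              [np.sin(a),  np.cos(a), 0],
              [0,          0,         1]])
T = np.array([0.1, -0.05, 2.0])
H2 = np.eye(4)
H2[:3, :3], H2[:3, 3] = R, T                   # rigid-body transform [R T; 0 1]

def project(Xw):
    """World point -> pixel coordinates via Zc [u, v, 1]^T = H1 H2 [Xw 1]^T."""
    p = H1 @ H2 @ np.append(Xw, 1.0)
    return p[:2] / p[2]                        # divide out the depth Zc

uv = project(np.array([0.05, 0.02, 0.0]))      # a point on the Zw = 0 plate plane
```

Because the calibration plate lies in the Zw = 0 plane, the same product H1 H2 reduces to the 3 × 3 homography H that Zhang's method estimates from each view.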
Zhang's plane calibration method, however, considers only radial distortion. To improve the accuracy of camera calibration, this paper obtains not only the radial distortion coefficients k1, k2, and k3 during calibration, but also the two tangential distortion coefficients p1 and p2. The nonlinear distortion model can be expressed as

x′ = x + x[k1(x² + y²) + k2(x² + y²)² + k3(x² + y²)⁴] + [p1(3x² + y²) + 2p2xy]
y′ = y + y[k1(x² + y²) + k2(x² + y²)² + k3(x² + y²)⁴] + [p2(x² + 3y²) + 2p1xy]   (21)

In the formula, (x, y) and (x′, y′) represent the coordinates of (u, v) and (u′, v′) in the image coordinate system, respectively. The radial distortion coefficients k1, k2, k3 and the tangential distortion coefficients p1, p2 can be obtained by the least-squares method.

Table 4. Average projection error of image.

Image number | Circular mean error (pixels) | Checkerboard mean error (pixels)
 1 | 0.014948 | 0.177772
 2 | 0.014980 | 0.150344
 3 | 0.017036 | 0.153539
 4 | 0.016154 | 0.154179
 5 | 0.013988 | 0.158444
 6 | 0.013338 | 0.153158
 7 | 0.013955 | 0.151452
 8 | 0.015043 | 0.141982
 9 | 0.014111 | 0.140072
10 | 0.013256 | 0.165126
11 | 0.014871 | 0.149872
12 | 0.013886 | 0.162394
13 | 0.013313 | 0.160741
14 | 0.013468 | 0.173434

(1) Uneven illumination (Table 1 and Table 2):

H1(circular array) = [4896.563707  0            774.879906]
                     [0            4893.757057  549.748346]
                     [0            0            1         ]
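A direct transcription of the distortion model of Equation (21) follows; the coefficient values are invented for illustration (the paper estimates k1, k2, k3, p1, p2 by least squares from the calibration data).

```python
import numpy as np

# Hypothetical distortion coefficients, for illustration only.
k1, k2, k3 = -0.12, 0.03, 0.001
p1, p2 = 1e-4, -5e-5

def distort(x, y):
    """Apply the nonlinear distortion model of eq. (21): radial terms in
    r^2 = x^2 + y^2 plus tangential terms."""
    r2 = x * x + y * y
    radial = k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 4
    xd = x + x * radial + p1 * (3 * x * x + y * y) + 2 * p2 * x * y
    yd = y + y * radial + p2 * (x * x + 3 * y * y) + 2 * p1 * x * y
    return xd, yd

xd, yd = distort(0.2, -0.1)   # with k1 < 0, points are pulled toward the centre
```

Undistortion, as needed when correcting detected centre coordinates, is the inverse problem and is typically solved iteratively, since eq. (21) has no closed-form inverse.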
The average re-projection error of the circle centre coordinates obtained by the improved calibration algorithm is less than 0.12 pixels under all illumination conditions. Compared with using a checkerboard as the calibration object together with Zhang's plane-based camera calibration method, the calibration results are more accurate and meet the requirements of practical calibration applications.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Funding

This work was supported in part by the Academic support project for top-notch talents in disciplines (majors) in colleges and universities (No. gxbjzd2021065); in part by the Anhui Polytechnic University–Jiujiang District industrial collaborative innovation special fund project "Research on high precision collaborative control system of multi degree of freedom robot" (No. 2021cyxtb2); and in part by the Wuhu key R & D project "R & D and application of key technologies of robot intelligent detection system based on 3D vision" (No. 2021yf32).

Data availability statement

The data used to support the findings of this study are available from the corresponding author upon request.

References

Abdel-Aziz, Y. L. (2015). Direct linear transformation from comparator coordinates into object space coordinates in close-range photogrammetry. Photogrammetric Engineering & Remote Sensing. https://doi.org/10.14358/PERS.81.2.103
Bozomitu, R. G., Pasarica, A., & Cehan, V. (2016). Implementation of eye-tracking system based on circular Hough transform algorithm. Proc. 5th Conf. on E-Health and Bioe. (EHB).
Chen, W. (2020). Research on calibration method of industrial robot vision system based on halcon. Electronic Measurement Technology, 43(21). https://doi.org/10.19651/j.cnki.emt.2004852
Crombrugge, I. V., Penne, R., & Vanlanduit, S. (2021). Extrinsic camera calibration with line-laser projection. Sensors, 21(4). https://doi.org/10.3390/s21041091
Frosio, I., & Borghese, N. A. (2008). Real-time accurate circle fitting with occlusions. Pattern Recognition, 41(3), 1041–1055. https://doi.org/10.1016/j.patcog.2007.08.011
Graves, A., Liwicki, M., Fernandez, S., Bertolami, R., Bunke, H., & Schmidhuber, J. (2009). A novel connectionist system for unconstrained handwriting recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(5), 855–868. https://doi.org/10.1109/TPAMI.2008.137
Hu, M. K. (1962). Visual pattern recognition by moment invariants. IRE Transactions on Information Theory, 8(2), 179–187. https://doi.org/10.1109/TIT.1962.1057692
Huang, X., Zhang, F., Li, H., & Liu, X. (2017). An online technology for measuring icing shape on conductor based on vision and force sensors. IEEE Transactions on Instrumentation and Measurement, 66(32), 3180–3189. https://doi.org/10.1109/TIM.2017.2746438
Huang, Z., Su, Y., Wang, Q., & Zhang, C. G. (2020). Research on external parameter calibration method of two-dimensional lidar and visible light camera. Journal of Instrumentation. https://doi.org/10.19650/j.cnki.cjsi.J2006756
Ito, Y., Ogawa, K., & Nakano, K. (2011). Fast ellipse detection algorithm using Hough transform on the GPU. Proc. Int. Conf. on Net. & Comput. (ICNC).
Kahn, P., Kitchen, L., & Riseman, E. M. (1990). A fast line finder for vision-guided robot navigation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 12(11), 1098–1102. https://doi.org/10.1109/34.61710
Kanakam, T. M. (2017). Adaptable ring for vision-based measurements and shape analysis. IEEE Transactions on Instrumentation and Measurement, 66(4), 746–756. https://doi.org/10.1109/TIM.2017.2650738
Li, H., Wu, F., & Hu, Z. (2000). A new self-calibration method for linear camera. Journal of Computer Science, 23(11), 9.
Li, M., Ma, K., Xu, Y., & Wang, F. (2020). Research on error compensation method of morphology measurement based on monocular structured light. Journal of Instrumentation, 41(5). https://doi.org/10.19650/j.cnki.cjsi.J2005999
Liu, K., Wang, H., Chen, H., Qu, E., Tian, Y., & Sun, H. (2017). Steel surface defect detection using a new Haar–Weibull-variance model in unsupervised manner. IEEE Transactions on Instrumentation and Measurement, 66(10), 2585–2596. https://doi.org/10.1109/TIM.2017.2712838
Liu, Y. (2001). Accurate calibration of standard plenoptic cameras using corner features from raw images. Optics Express, 21(1), 158–169. https://doi.org/10.1364/OE.405168
Lu, X., Xue, J., & Zhang, Q. (2020). A high-precision camera calibration method based on the calculation of real image coordinates at the center of a circle. China Laser, 47(3). https://kns.cnki.net/kcms/detail/31.1339.
Maybank, S. J., & Faugeras, O. D. (1992). A theory of self-calibration of a moving camera. International Journal of Computer Vision, 8(2), 123–151. https://doi.org/10.1007/BF00127171
Peng, Y., Guo, J., Yu, C., & Ke, B. (2022). High precision camera calibration method based on plane transformation. Journal of Beijing University of Aeronautics and Astronautics. https://doi.org/10.13700/j.bh.1001-5965.2021.0015
Qiu, M., Ma, S., & Li, Y. (2000). Overview of camera calibration in computer vision. Acta Automatica Sinica, 26(1). https://doi.org/10.16383/j.aas.2000.01.006
Rudakova, V., & Monasse, P. (2014). Camera matrix calibration using circular control points and separate correction of the geometric distortion field. Proc. 11th Conf. on Comput. & Rob. Vis. (CRV).
Sang, D. M. (1996). A self-calibration technique for active vision systems. IEEE Transactions on Robotics and Automation, 12(1), 114–120. https://doi.org/10.1109/70.481755
Sang, J. (2021). Constrained multiple planar reconstruction for automatic camera calibration of intelligent vehicles. Sensors, 21(14), 4643. https://doi.org/10.3390/s21144643
Tsai, R. Y. (1986). An efficient and accurate camera calibration technique for 3D machine vision. Proc. of Comp. Vis. Patt. Recog. (CVPR).
Wang, W., Zhao, J., Hao, Y., & Zhang, X. L. (2016). Research on nonlinear least squares ellipse fitting based on Levenberg–Marquardt algorithm. Proc. 13th Int. Conf. on Ubiquit. Rob. and Amb. Intel. (URAI).
Wu, F., & Hu, Z. (2001). Linear theory and algorithm of camera self-calibration. Journal of Computer Science, 24(11), 1121–1135. https://doi.org/10.3321/j.issn:0254-4164.2001.11.001
Wu, F., Liu, J., & Ren, X. (2013). Calibration method of panoramic camera for deep space exploration based on circular marker points. Journal of Optics, 11. CNKI:SUN:GXXB.0.2013-11-023
Wu, J., Jiang, L., Wang, A., & Yu, P. (2018). Offset compensation algorithm for circular sign projection. Chinese Journal of Image and Graphics, 23(10). CNKI:SUN:ZGTB.0.2018-10-010
Xie, Z., & Wang, X. (2019). Center extraction of planar calibration target marker points. Optics and Precision Engineering, 27(2), 440–449. https://doi.org/10.3788/OPE.20192702.0440
Yang, C., Wang, W., & Hu, Z. (1998). A self-calibration method of camera internal parameters based on active vision. Journal of Computer Science, 21(5). https://doi.org/10.3321/j.issn:0254-4164.1998.05.006
Yang, J., Zhang, D., Yang, J. Y., & Niu, B. (2007). Globally maximizing, locally minimizing: Unsupervised discriminant projection with applications to face and palm biometrics. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(4), 650–664. https://doi.org/10.1109/TPAMI.2007.1008
Zhang, M., Yang, Y., & Qin, R. (2017). Dynamic adaptive genetic algorithm camera calibration based on circular array template. Proc. 36th Chinese Control Conf. (CCC).
Zhang, M. Y., Zhang, Q., & Duan, H. (2019). Pose self-calibration method of monocular camera based on motion trajectory. Journal of Huazhong University of Science and Technology: Natural Science Edition. https://doi.org/10.13245/j.hust.190212
Zhang, Z. (1999). Flexible camera calibration by viewing a plane from unknown orientations. Proc. IEEE 7th Int. Conf. on Comput. Vis. (ICCV).
Zhang, Z., & Tang, Q. (2016). Camera self-calibration based on multiple view images. Proc. Nicograph International, Hangzhou, China.
Zhu, W., Cao, L., & Mei, B. (2014). Accurate calibration of industrial camera using asymmetric projection of circle center. Optical Precision Engineering, 22(8). https://doi.org/10.3788/OPE.20142208.2267