Measurement
journal homepage: www.elsevier.com/locate/measurement
Stereo camera calibration for large field of view digital image correlation
using zoom lens
Zeren Gao a, Yue Gao b, Yong Su a, Yang Liu a, Zheng Fang a, Yaru Wang a, Qingchuan Zhang a,∗

a Key Laboratory of Mechanical Behavior and Design of Materials of Chinese Academy of Science, University of Science and Technology of China, Hefei 230027, China
b Beijing Institute of Structure and Environment Engineering, Beijing 100076, China
We propose a calibration method for a stereo camera with a large field of view (FOV) using zoom lenses. We adopted Magill's formula, using intrinsic parameters with small object distances to calculate intrinsic parameters with large object distances. We employed a regular sized calibration board to calibrate extrinsic parameters in the small FOV, using zoom lenses to reduce the FOV, and then calculated the extrinsic parameters in the large FOV. Verification experiments showed the 3D reconstruction error was approximately 1 mm when the FOV is about 6 × 6 m. The proposed calibration method will broaden the application range of 3D digital image correlation in scientific and engineering fields.

Keywords: Camera calibration; Large field of view; Digital image correlation; Zoom lens
∗ Corresponding author.
E-mail address: [email protected] (Q. Zhang).
https://doi.org/10.1016/j.measurement.2021.109999
Received 17 December 2020; Received in revised form 30 July 2021; Accepted 1 August 2021
Available online 9 August 2021
0263-2241/© 2021 Elsevier Ltd. All rights reserved.
Z. Gao et al. Measurement 185 (2021) 109999
method using conventional size calibration board and zoom lenses. Camera lens combinations are usually determined based on the applied FOV and working distance. Lenses used in the image measurement field are usually prime lenses because they provide better image quality than zoom lenses. Thanks to the advances in modern lens design capabilities, zoom lens image quality is now sufficient for most measurement scenarios. The FOV at the telephoto end of a zoom lens is very small, and the calibration board can be correspondingly small. We can calibrate the extrinsic parameters of the stereo camera by using the images taken at the telephoto end.

However, this raises the issue of how to calibrate the intrinsic and extrinsic parameters at the short focal end (large FOV), and we divide this into two parts. For intrinsic parameters, Magill's formula [36] shows that image distortion is related to object distance, and image distance is also related to object distance. Therefore, we can use a series of camera intrinsic parameters under close range imaging to calculate camera intrinsic parameters under long distance imaging in large-FOV experiments. For extrinsic parameters, we can use a conventional sized calibration board to calibrate the rotation and translation matrix at the telephoto end. For the short focal length end, the rotation–translation matrix can be calculated using the extrinsic parameters at the telephoto end and the image distance changes.

The remainder of this article is arranged as follows. Section 2 introduces Magill's formula and builds a geometric model for a stereo camera with zoom lenses. Section 3 details accuracy verification experiments for the proposed method. Finally, Section 4 summarizes and concludes the paper.

2. Methodology

It becomes difficult to make a calibration board that matches the FOV when the measurement FOV is large. Therefore, we propose to calibrate intrinsic and extrinsic parameters separately.

Fig. 1 shows the proposed calibration process for a dual camera system with zoom lenses. To calibrate the intrinsic parameters, we need to obtain object distance parameters and intrinsic parameters at various distances under conventional FOV conditions, and then calculate the intrinsic parameters under a large FOV using Magill's formula. We then use a zoom lens's telephoto end to calibrate extrinsic parameters with a conventional sized calibration board, and subsequently calculate the short focus end's extrinsic parameters. Finally, the intrinsic and extrinsic parameters are combined and optimized to obtain the complete calibration parameters.

The 3D coordinates of specimen surfaces are calculated by combining intrinsic parameters and the relative positions between the two cameras (calibrated beforehand). The process to obtain intrinsic and extrinsic parameters for the camera pair is called stereo-camera calibration.

We use the pinhole camera model, forming scene views by projecting 3D points onto an image plane using a perspective transformation. This transformation can be expressed in homogeneous coordinates as

$$\alpha \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & f_s & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R_{11} & R_{12} & R_{13} & t_x \\ R_{21} & R_{22} & R_{23} & t_y \\ R_{31} & R_{32} & R_{33} & t_z \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}, \tag{1}$$

where $(c_x, c_y)$ is the principal point, usually the image center; $(f_x, f_y)$ are the image distance expressed in pixels; $f_s$ is the non-perpendicular angle for the camera sensor array; and $\alpha$ is a nonzero scale factor (these parameters and the distortion parameters are called camera intrinsic parameters); $R \mid T$ is the transformation between the world and principal point coordinate systems, called the extrinsic parameters; $(X_w, Y_w, Z_w)^T$ are the coordinates of a 3D point in the world coordinate system; and $(x_i, y_i)^T$ are the coordinates of the projection point in pixels.

We define the transformation from the world to the left camera principal point coordinate system as $R^l \mid T^l$, the corresponding transformation for the right camera as $R^r \mid T^r$, and the transformation from the left to the right camera principal point coordinate system as $R^h \mid T^h$ (Fig. 2). We use the plane model calibration method to obtain $R^h \mid T^h$ [28], which requires the two cameras to take a set of calibration board pictures simultaneously. We then solve for $R^h \mid T^h$ by calibrating $R^l \mid T^l$ and $R^r \mid T^r$, and $R^h \mid T^h$ is optimized by minimizing the re-projection error.

The world coordinate system $(X_w, Y_w, Z_w)^T$ is artificially defined, hence to simplify calculation we define the world coordinate system to coincide with the principal point coordinate system of the left camera. Therefore, $R$ becomes a unit matrix and $T$ a zero vector for the left camera in (1), and $R \mid T$ is the transformation from the principal point coordinate system of the left camera to that of the right camera. Thus, (1) can be simplified to

$$\begin{cases} x_i^l = c_x^l + \dfrac{f_x^l X_w}{Z_w} + \dfrac{f_s^l Y_w}{Z_w} \\[6pt] y_i^l = c_y^l + \dfrac{f_y^l Y_w}{Z_w} \\[6pt] x_i^r = c_x^r + \dfrac{f_x^r (R^h_{11} X_w + R^h_{12} Y_w + R^h_{13} Z_w + t^h_x)}{R^h_{31} X_w + R^h_{32} Y_w + R^h_{33} Z_w + t^h_z} + \dfrac{f_s^r (R^h_{21} X_w + R^h_{22} Y_w + R^h_{23} Z_w + t^h_y)}{R^h_{31} X_w + R^h_{32} Y_w + R^h_{33} Z_w + t^h_z} \\[6pt] y_i^r = c_y^r + \dfrac{f_y^r (R^h_{21} X_w + R^h_{22} Y_w + R^h_{23} Z_w + t^h_y)}{R^h_{31} X_w + R^h_{32} Y_w + R^h_{33} Z_w + t^h_z} \end{cases} \tag{2}$$
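To make the projection model concrete, (1) can be sketched in a few lines of NumPy; the intrinsic values below are placeholders for illustration, not the calibrated parameters of this paper.

```python
import numpy as np

def project(K, R, t, Xw):
    """Project a 3D world point to pixel coordinates as in Eq. (1)."""
    Xc = R @ Xw + t          # world -> camera principal point coordinates
    x = K @ Xc               # apply the intrinsic matrix
    return x[:2] / x[2]      # divide out the scale factor (alpha = depth)

# Placeholder intrinsics: f_x = f_y = 3500 px, principal point (1688, 1352), f_s = 0
K = np.array([[3500.0,    0.0, 1688.0],
              [   0.0, 3500.0, 1352.0],
              [   0.0,    0.0,    1.0]])

# The left camera defines the world frame, so R = I and t = 0
R, t = np.eye(3), np.zeros(3)
uv = project(K, R, t, np.array([0.5, -0.2, 7.0]))  # point 7 m in front
# uv == (1938.0, 1252.0): x = 3500*0.5/7 + 1688, y = 3500*(-0.2)/7 + 1352
```

The right camera view follows the same call with $(R^h, T^h)$ in place of the identity transform, which is exactly the specialization written out in (2).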
Table 1
Intrinsic and distortion parameters at different distances.
Distance (mm) 𝑐𝑥 𝑐𝑦 𝑓𝑥 𝑓𝑦 𝑘1
100 1064.798 983.306 4503.965 4502.986 −0.525
152 1059.192 971.609 4360.746 4360.514 −0.382
178 1049.711 974.095 4312.999 4311.592 −0.347
202 1090.903 965.569 4287.47 4287.458 −0.309
254 1091.64 951.801 4249.264 4249.234 −0.28
329 1075.333 979.008 4205.814 4205.434 −0.238
404 1077.445 975.855 4179.425 4179.473 −0.215
509 1068.332 972.947 4166.936 4167.032 −0.19
650 1079.033 986.8 4131.49 4131.34 −0.164
900 1072.913 982.778 4127.8 4127.55 −0.154
Fig. 2. Coordinate system for the proposed 3D DIC approach.

… imaging distance, so $f_s$ was not included in the following discussion. The imaging model for pinhole imaging with radial distortion can be described as

$$\begin{aligned} \begin{bmatrix} x \\ y \\ z \end{bmatrix} &= R \begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} + t, \\ x' &= x/z, \qquad y' = y/z, \\ x'' &= x' (1 + k_1 r^2 + k_2 r^4), \\ y'' &= y' (1 + k_1 r^2 + k_2 r^4), \\ u &= f_x\, x'' + c_x, \qquad v = f_y\, y'' + c_y, \end{aligned} \tag{3}$$

where $(x, y, z)$, $(x', y')$, $(x'', y'')$, and $(u, v)$ represent coordinates in the principal point system, the imaging plane, the imaging plane with distortion, and pixel coordinates with distortion, respectively, with $r^2 = x'^2 + y'^2$. A large FOV makes it difficult to create a large calibration board to calibrate the intrinsic parameters. Therefore, we used Magill's formula to infer the intrinsic parameters under actual measurement conditions by calibrating the intrinsic parameters at several close distances. Magill's formula can be expressed as

$$\delta r_s = \delta r_\infty - m_s\, \delta r_{-\infty}, \tag{4}$$

with magnification

$$m_s = \frac{F}{S - F}, \tag{5}$$

where $F$ is the lens focal length; $S$ is the distance to the object plane that the lens is focused onto; $\delta r_s$ is the focal distortion on the object plane; $\delta r_\infty$ is the distortion for infinity focus; and $\delta r_{-\infty}$ is the distortion for inverted infinity focus (i.e., if the lens is reversed so that the front element becomes the rear element and vice versa).

In order to facilitate the calculation of the distortion parameters, substituting (5) into (4) gives

$$\delta r_s = \delta r_\infty - \frac{F}{S - F}\, \delta r_{-\infty}, \tag{6}$$

establishing a parametric expression for the distortion parameters with respect to the object distance,

$$(S_i - F)\, \delta r_s = (S_i - F)\, \delta r_\infty - F\, \delta r_{-\infty}. \tag{7}$$

We can obtain $\delta r_{-\infty}$ and $\delta r_\infty$ by solving (7) using least squares, and subsequently use them in (6) to predict the distortion parameters $(k_1, k_2)$ at any distance.

For the convenience of derivation, a second-order radial distortion model is used in this paper, but the proposed calibration method is also suitable for higher-order models. Based on our previous work [18], the second-order radial distortion model is suitable for most lenses.

We can use a ruler or laser rangefinder to measure the distance $D$ from the camera to the object of interest, expressing the object distance as $S = D + d$, where $d$ is an unknown constant. If $f$ is the image distance, then

$$\frac{1}{D + d} + \frac{1}{f} = \frac{1}{F}, \tag{8}$$

where $d$ and $F$ are unknown quantities. Since $f$ is normally expressed in pixels, we need the conversion $\beta$ between pixels and millimeters to express (8) in millimeters,

$$\frac{1}{D + d} + \frac{1}{f \beta} = \frac{1}{F}. \tag{9}$$

We can obtain $\beta$ from the factory parameters of the camera or calculate it iteratively (for our grayscale camera, $\beta$ = 3.69 μm/pixel). For example, $(\beta, d, F)$ can be obtained by parameter optimization over three sets of calibration data at different distances. Let

$$h_i(\beta, d, F) = \frac{1}{D_i + d} + \frac{1}{f_i \beta} - \frac{1}{F} \tag{10}$$

be the objective function; then we can obtain $(\beta, d, F)$ by solving

$$(\beta, d, F) = \arg\min \sum_{i=1}^{n} (h_i)^2. \tag{11}$$

Thus, we obtain the object distance $S$, and $f_x$ and $f_y$ can be calculated by measuring $D$ between the camera and the object of interest in a large FOV.

In order to verify the validity of Magill's formula, we carried out a verification experiment. We calibrated the distortion parameters at ten distances (Table 1). As shown in Fig. 3, the first nine groups of data were used, through the above method, to calculate the distortion of the tenth group, and the results showed a high degree of agreement.

All the intrinsic parameters except $(c_x, c_y)$ can be calculated by the proposed method. Since optical center offsets are caused by misalignment between the sensor and the installation port center positions, the values do not correlate with the imaging distance. We use the average value of the optical center coordinates obtained in the conventional FOV as the initial value of $(c_x, c_y)$ under the large FOV.

2.3. Extrinsic parameters calibration

For a zoom lens, we use the telephoto end to calibrate the intrinsic and extrinsic parameters for a camera pair with a regular sized calibration board. Fig. 4 shows the FOV for a stereo camera at different focal lengths. Both intrinsic and extrinsic parameters change as the cameras are adjusted from telephoto to short focal conditions. Intrinsic parameters can be calculated from the object distance, as described in Section 2.2. To obtain the extrinsic parameters at the short focal end, we calibrate the extrinsic parameters $R^h \mid T^h$ at the telephoto end first.
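Once the object distances $S_i$ and focal length $F$ are known, the least-squares step around (7) is linear in $(\delta r_\infty, \delta r_{-\infty})$ and reduces to an ordinary linear solve. A minimal NumPy sketch on synthetic data (all numbers are made up; in the paper $F$ and $d$ are themselves estimated via (8)–(11)):

```python
import numpy as np

def fit_magill(S, dr_s, F):
    """Solve (S_i - F)*dr_s = (S_i - F)*dr_inf - F*dr_minus_inf, Eq. (7),
    for (dr_inf, dr_minus_inf) by linear least squares."""
    A = np.column_stack([S - F, -F * np.ones_like(S)])
    b = (S - F) * dr_s
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x  # [dr_inf, dr_minus_inf]

def magill(S, dr_inf, dr_minus_inf, F):
    """Distortion at object distance S, Eq. (6)."""
    return dr_inf - F / (S - F) * dr_minus_inf

# Synthetic check: distortion samples generated from known parameters
F = 13.0                                       # focal length in mm (made up)
S = np.array([100.0, 150.0, 200.0, 300.0, 500.0, 900.0])
k1 = magill(S, -0.23, 1.0, F)                  # ground truth: (-0.23, 1.0)
est = fit_magill(S, k1, F)                     # recovers both parameters
```

With the two fitted constants in hand, `magill` then predicts the distortion at any working distance, which is the extrapolation step used for the large FOV.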
Fig. 4. Field of view (FOV) for a stereo camera with respect to focal length.
Fig. 3. Verification experiment of Magill's formula. The red dots represent the actual calibration data, and the black lines represent the curves of the parametric equations obtained from the first nine sets of data.
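The image-distance calibration in (9)–(11) is nonlinear in $(\beta, d, F)$ jointly, but for a fixed candidate $d$, Eq. (9) is linear in $u = 1/\beta$ and $v = 1/F$. The sketch below (synthetic numbers throughout) exploits this by solving the linear part exactly and grid-searching $d$; it is one possible solver, not necessarily the optimizer used in the paper.

```python
import numpy as np

def fit_thin_lens(D, f_px, d_grid):
    """Minimize Eq. (11): for each candidate d, Eq. (9) becomes
    u/f_i - v = -1/(D_i + d) with u = 1/beta and v = 1/F, a linear
    least-squares problem; keep the d with the smallest residual."""
    best = None
    for d in d_grid:
        A = np.column_stack([1.0 / f_px, -np.ones_like(f_px)])
        b = -1.0 / (D + d)
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        r = np.sum((A @ x - b) ** 2)
        if best is None or r < best[0]:
            best = (r, d, 1.0 / x[0], 1.0 / x[1])
    _, d, beta, F = best
    return d, beta, F

# Synthetic data from known beta = 0.00369 mm/px, F = 13.1 mm, d = 5.0 mm
beta_t, F_t, d_t = 0.00369, 13.1, 5.0
D = np.array([100.0, 150.0, 250.0, 400.0, 900.0])      # measured distances, mm
f_px = (1.0 / (1.0 / F_t - 1.0 / (D + d_t))) / beta_t  # image distances, px
d, beta, F = fit_thin_lens(D, f_px, np.arange(0.0, 10.01, 0.5))
```

Because the synthetic data are noiseless and the grid contains the true $d$, the fit recovers $(\beta, d, F)$ essentially exactly; with real calibration data the residual in (11) stays finite and the grid would be refined around its minimum.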
World coordinates at the short focal end are shifted by a fixed distance in the optical axis direction compared with the telephoto end,

$$\begin{cases} X_w' = X_w \\ Y_w' = Y_w \\ Z_w' = Z_w + (f_{long} - f_{short}) \end{cases} \tag{12}$$

where $f_{long}$ is the image distance obtained when the telephoto end is calibrated, and $f_{short}$ is the image distance obtained at the short focal length as shown in Section 2.2.

Fig. 5. Stereo camera field of view (FOV) with respect to focal length.

The projection relationship conforms to (2) at the telephoto end with the left camera principal point coordinate system as the world coordinate system. The extrinsic parameter $R^h$ does not change as the focal length changes to the short focal end, but $T^h$ does. Thus, we only need to know the new $T^h$. We use a prime in the upper right corner to indicate a parameter at the short focal end.

There is a new transformation matrix $R^h \mid T^{h\prime}$ when the lens changes from telephoto to short focal, describing the transition from the new left camera principal point coordinate system to the new right camera principal point coordinate system. The rotation–translation from the new left camera to the new right camera principal point coordinate system can be expressed as

$$\begin{bmatrix} R_{11} & R_{12} & R_{13} & t_x' \\ R_{21} & R_{22} & R_{23} & t_y' \\ R_{31} & R_{32} & R_{33} & t_z' \end{bmatrix} \begin{bmatrix} X_w' \\ Y_w' \\ Z_w' \\ 1 \end{bmatrix} = \begin{bmatrix} R_{11} & R_{12} & R_{13} & t_x \\ R_{21} & R_{22} & R_{23} & t_y \\ R_{31} & R_{32} & R_{33} & t_z \end{bmatrix} \begin{bmatrix} X_w' \\ Y_w' \\ Z_w' - (f_{long} - f_{short})_l \\ 1 \end{bmatrix} + \begin{bmatrix} 0 \\ 0 \\ (f_{long} - f_{short})_r \end{bmatrix} \tag{13}$$

Fig. 5 shows that (13) can be understood as the following steps:

1. translate the world coordinate system for the short focal end along the $Z_w$ direction to the principal point for the left camera telephoto end,
2. use $R^h \mid T^h$ to transform the world coordinate system to the right camera principal point position for the telephoto end, and
3. translate it to the principal point position for the short focal end along the right camera optical axis.

Solving (13) gives

$$\begin{cases} t_x' = t_x - R_{13}\, (f_{long} - f_{short})_l \\ t_y' = t_y - R_{23}\, (f_{long} - f_{short})_l \\ t_z' = t_z - R_{33}\, (f_{long} - f_{short})_l + (f_{long} - f_{short})_r \end{cases} \tag{14}$$

Thus, to calculate the extrinsic parameters for the short focal end, we need to know the image distance difference between the telephoto and short focal ends, and the extrinsic parameters calibrated at the telephoto end. The image distance and extrinsic parameters at the telephoto end can be obtained by conventional calibration, and the image distance at the short focal end as shown in Section 2.2.

Thus, we have a process to calibrate extrinsic parameters for a stereo camera using zoom lenses. We can obtain the complete intrinsic and extrinsic parameters following Section 2.2, and subsequently calculate image coordinates for corresponding points in the left and right images using the 3D-DIC method. Since we then have the coordinates for corresponding points in the left and right camera images and the complete intrinsic and extrinsic parameters, we can optimize the parameters by minimizing epipolar error or by bundle adjustment (BA), as convenient [32]. This paper optimized the parameters by minimizing the epipolar error,

$$\arg\min_{R,T,A} \sum_{i=1}^{n} \sum_{j=1}^{m} \left( A_r^{-1}\, p_r \right)^{T} R^{h\prime} \cdot \left( T^{h\prime} \times \left( A_l^{-1}\, p_l \right) \right), \tag{15}$$

where $n$ is the number of stereo images; $m$ is the number of corresponding points in each pair of stereo images; $p_r$ and $p_l$ are the corresponding point coordinates for the left and right images, respectively; and $A$ is
Fig. 6. Intrinsic parameters calibration: (a) experimental setup; model curve and experimental parameter values for (b) left and (c) right cameras.
set of intrinsic parameters for the cameras, which project the image point $p$ onto the principal point coordinate system. The optimization of (15) relies on the coordinates of corresponding points. Therefore, it is recommended that the corresponding points occupy most of the FOV.

3. Experiment

We performed an intrinsic parameter calibration experiment to verify the effectiveness of the intrinsic parameter calibration method proposed in Section 2.2. The calibration results were then used to 3D reconstruct a wall of approximately 6 × 4 m. It was difficult to find a suitable reference with accurate size information since the FOV was so large, hence we pasted some markers on the wall, measured the distances between markers with a steel ruler, and subsequently compared the 3D reconstruction results with the measured distances.

Two 5 megapixel cameras (PointGrey, 3376 × 2704 pixels, 3.69 μm/pixel) were used with 11–40 mm zoom lenses (Soyo Ltd). The measurement system was 7412.8 mm away from the object of interest, and the distances between the cameras and the object were measured by laser rangefinder (ProsKit NT-85, distance accuracy 1.5 mm). The distance between the two cameras was approximately 1 m.

Fig. 6(a) shows the experimental setup for the intrinsic parameter calibration experiment. The laser rangefinder was attached to the cameras with its optical axis parallel to the camera optical axis. The zoom lens was adjusted to the short focal end. Specific calibration steps are as follows:

Tables 2 and 3 show the calibration results for the left and right camera intrinsic parameters with respect to imaging distance. $c_x$ and $c_y$ are almost constant for the different imaging distances, consistent with our assumptions regarding optical center stability. Thus, we are justified in assuming the optical center coordinates for the large FOV are the same as those for the conventional FOV.

We then calculated $d$, $F$, $\delta r_{-\infty}$, and $\delta r_\infty$ by substituting the data in Tables 2 and 3 into (7) and (11). Fig. 6(b) and (c) show the curve corresponding to (6) with the known constant terms $\delta r_{-\infty}$ and $\delta r_\infty$. The proposed model was highly consistent with the experimental data.

Table 4 shows the final intrinsic parameters at this imaging distance, obtained by substituting the imaging distance (7412.8 mm) into (5) and (8).
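As a quick consistency check (our arithmetic, not a computation reported in the paper), substituting the left camera values from Table 4 into the thin-lens relation (8) and converting with β = 3.69 μm/pixel reproduces the tabulated $f_x$ to within a fraction of a pixel:

```python
# Left camera values from Table 4: F = 13.11 mm, d = 0.0032, D = 7412.8 mm
F, d, D, beta = 13.11, 0.0032, 7412.8, 3.69e-3   # beta in mm per pixel

f_mm = 1.0 / (1.0 / F - 1.0 / (D + d))  # image distance from Eq. (8), in mm
f_px = f_mm / beta                      # expressed in pixels, Eq. (9)
# f_px is approximately 3559, close to f_x = 3559.34 reported in Table 4
```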
5
Z. Gao et al. Measurement 185 (2021) 109999
Fig. 8. Experimental images: (a1) and (a2) calibration board by left and right cameras, respectively, at telephoto end; (b1) and (b2) wall by left and right cameras, respectively,
at telephoto end; (c1) and (c2) wall by left and right cameras, respectively, at the short focal end.
Table 4
Parameters to be evaluated and intrinsic parameters for this imaging distance.
Parameter Left camera Right camera
𝑑 0.0032 0.0032
𝐹 13.11 mm 13.10 mm
𝛿𝑟∞ −0.228 −0.234
𝛿𝑟−∞ 1.072 0.714
𝑓𝑥 3559.34 3575.54
𝑓𝑦 3558.56 3577.66
𝑐𝑥 1676.07 1706.40
𝑐𝑦 1392.22 1395.19
𝑘1 −0.23016 −0.23506
𝑘2 0.128023 0.131281
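The tabulated distortion also checks out against Magill's formula (again our arithmetic, reading the relation as the infinity-focus term minus magnification times the reversed-lens term), using the left camera values in Table 4:

```python
# Left camera values from Table 4: F = 13.11 mm, dr_inf = -0.228,
# dr_minus_inf = 1.072, evaluated at S ≈ D = 7412.8 mm
F, S = 13.11, 7412.8
dr_inf, dr_minus_inf = -0.228, 1.072

m_s = F / (S - F)                      # magnification, Eq. (5)
k1 = dr_inf - m_s * dr_minus_inf       # Magill prediction at this distance
# k1 is approximately -0.230, close to k1 = -0.23016 reported in Table 4
```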
Fig. 9. Three-dimensional wall reconstruction.

Table 2
Intrinsic and distortion parameters for the left camera.
Distance (mm) 𝑐𝑥 𝑐𝑦 𝑓𝑥 𝑓𝑦 𝑓𝑠 𝑘1 𝑘2
245 1674.741 1393.393 3628.885 3628.463 0.097 −0.282 0.139
337 1667.493 1391.764 3588.476 3587.972 0.264 −0.268 0.156
440 1667.781 1389.307 3575.88 3575.225 0.138 −0.261 0.134
714 1681.733 1393.017 3563.103 3562.838 0.111 −0.251 0.121
1056 1676.306 1393.37 3539.804 3539.729 −0.085 −0.244 0.131
1596 1680.969 1392.756 3553.193 3553.261 −1.237 −0.237 0.12
2255 1683.437 1391.925 3553.019 3552.252 0.089 −0.234 0.133

Table 3
Intrinsic and distortion parameters for the right camera.
Distance (mm) 𝑐𝑥 𝑐𝑦 𝑓𝑥 𝑓𝑦 𝑓𝑠 𝑘1 𝑘2
247 1705.155 1396.109 3664.056 3663.472 −0.963 −0.285 0.131
336 1709.024 1397.532 3610.142 3609.297 0.009 −0.267 0.131
438 1706.941 1400.775 3601.898 3601.345 −0.139 −0.257 0.13
718 1709.252 1396.619 3584.231 3583.785 −0.175 −0.245 0.13
1056 1708.131 1392.792 3577.053 3577.844 −0.286 −0.24 0.094
1599 1706.78 1392.488 3574.819 3575.103 −0.512 −0.237 0.122
2233 1699.503 1389.984 3573.043 3574.829 3.363 −0.24 0.137

Table 5
Extrinsic parameters at the telephoto and short focal ends.
𝛼 𝛽 𝛾 𝑇𝑥 𝑇𝑦 𝑇𝑧

3.2. Accuracy verification and 3D reconstruction

After calibrating the intrinsic parameters of the camera, we performed a complete stereo camera calibration and reconstructed the wall topography (≈ 6 × 4 m) using the calibration results. Fig. 7 shows the experimental setup. The distance between the measurement system and the object of interest was measured by laser rangefinder (D = 7412.8 mm). The calibration board pattern was 12 × 9 at 90 mm spacing.

Fig. 8(a1) and (a2) show the calibration board with the camera zoom lens at the telephoto end. Fig. 8(b1) and (b2) show wall images taken at the telephoto end, and Fig. 8(c1) and (c2) show wall images taken at the short focal end. The FOV at the telephoto end is much smaller than at the short focal end, and the calibration board occupies a larger portion of the FOV at the telephoto end.

Table 5 shows the extrinsic parameters for the stereo camera at the telephoto end, obtained by conventional calibration methods. The short focal end extrinsic parameters in Table 5 were derived by substituting the telephoto end extrinsic parameters into (14). Parameters α, β, and γ in Table 5 refer to the rotation angles between the coordinate systems. Rotation angles and rotation matrices can be derived from each other using the Rodrigues transform. Finally, the extrinsic parameters at the short focal end were combined with the intrinsic parameters at the short focal end, as discussed in Section 3.1. Table 6 shows the optimal complete intrinsic and extrinsic parameters at the short focal end, obtained by minimizing epipolar error.

We attached five markers on the wall (Fig. 8(b1) and (b2)) to verify 3D reconstruction accuracy using the proposed calibration method. Fig. 8(b1) shows the marker serial numbers. Distances between markers were measured using a steel ruler, with each distance measured five
Table 6
Optimized intrinsic and extrinsic parameters.
Intrinsic parameters: c_x^l = 1675.28, c_y^l = 1392.99, c_x^r = 1701.67, c_y^r = 1395.49, f_x^l = 3559.30, f_y^l = 3558.51, f_x^r = 3558.52, f_y^r = 3557.82, k_1^l = −0.209761, k_2^l = 0.145816, k_1^r = −0.212564, k_2^r = 0.162247
Extrinsic parameters: α = −1.907, β = 6.077, γ = 0.505, T_x = −930.389, T_y = −9.433, T_z = 45.524

Table 7
Distance between marker points by steel ruler and 3D reconstruction (mm).
  1 2 3 4 5 Average Reconstruction Difference
0–2 457.9 458.0 457.8 458.2 458.0 457.98 457.85 −0.13
1–2 592.2 592.1 592.2 592.0 592.1 592.12 590.73 −1.39
2–3 593.9 594.0 594.1 593.9 594.0 593.98 594.44 0.46
2–4 841.2 841.5 841.5 841.6 841.5 841.46 841.12 −0.34
𝐷𝑐 4340 4336 4338 4337 4346 4337.4 4336.73 −0.67
𝐷𝑏 1407 1406 1406 1407 1404 1406 1400.87 −5.13

times and averaged. In order to better show the reliability of our 3D reconstruction results, we calculated the distance between the corners of the wall (represented by 𝐷𝑐) and the distance between the beam and the wall (represented by 𝐷𝑏) and compared them with the results of a laser rangefinder. Table 7 compares the measured and reconstructed distances. In the in-plane direction, there was only approximately 1 mm of random error between the 3D reconstructed and physically measured results. However, there is a systematic deviation of 5 mm (relative error 0.35%) in the depth direction. This deviation may be due to the large depth difference between the beam and the wall. The distortion parameters used in the measurement system are calculated according to the distance between the wall and the camera, which cannot well describe the distortion at the depth of the beam.

Figs. 9 and 10 show the 3D reconstruction of the wall surface using the calibrated system after verifying the 3D reconstruction accuracy.

4. Conclusions

This paper proposed an easily implemented calibration method for large-FOV stereo vision that addressed several traditional calibration method limitations. In particular, the proposed 3D-DIC method used Magill's formula to infer intrinsic parameters, and zoom lenses to change the FOV, allowing the calibration of extrinsic parameters with a conventional sized calibration board. Accurate stereo vision calibration is an important prerequisite for accurate large-FOV stereo vision measurement. We also evaluated dimensional accuracy for the proposed method. The 3D reconstruction error was apparently random, with approximately 1 mm magnitude for a FOV of ≈ 6 × 6 m.

The main limitation of the proposed method is that the cameras use zoom lenses, which could limit the application scope. However, most modern zoom lenses can adequately replace fixed focus lenses. The proposed calibration method provides opportunities for high-precision measurement applications using stereo vision methods for large FOVs, such as in materials science, civil engineering, automotive, and aerospace engineering.

CRediT authorship contribution statement

Zeren Gao: Conceptualization, Methodology, Software, Validation, Investigation, Data curation, Writing – original draft, Writing – review & editing, Visualization. Yue Gao: Supervision. Yong Su: Supervision. Yang Liu: Validation. Zheng Fang: Validation. Yaru Wang: Validation. Qingchuan Zhang: Funding acquisition.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

The authors thank their late colleague Xian Zhang for his help with the experiment.
Funding

This work was supported by the National Natural Science Foundation of China (grants 11872354, 11627803), and the Strategic Priority Research Program of the Chinese Academy of Sciences (grant XDB22040502).

References

[1] P. Luo, Y. Chao, M. Sutton, W.-H. Peters, Accurate measurement of three-dimensional deformations in deformable and rigid bodies using computer vision, Exp. Mech. 33 (2) (1993) 123–132.
[2] I. Yamaguchi, A laser-speckle strain gauge, J. Phys. E: Sci. Instrum. 14 (11) (1981) 1270.
[3] W. Peters, W. Ranson, Digital imaging techniques in experimental stress analysis, Opt. Eng. 21 (3) (1982) 213427.
[4] Y. Gao, T. Cheng, Y. Su, X. Xu, Y. Zhang, Q. Zhang, High-efficiency and high-accuracy digital image correlation for three-dimensional measurement, Opt. Lasers Eng. 65 (2015) 73–80.
[5] M.A. Sutton, F. Matta, D. Rizos, R. Ghorbani, S. Rajan, D.H. Mollenhauer, H.W. Schreier, A.O. Lasprilla, Recent progress in digital image correlation: Background and developments since the 2013 W. M. Murray lecture, Exp. Mech. 57 (1) (2017) 1–30, http://dx.doi.org/10.1007/s11340-016-0233-3.
[6] H. Yan, B. Pan, Three-dimensional displacement measurement based on the combination of digital holography and digital image correlation, Opt. Lett. 39 (17) (2014) 5166–5169, http://dx.doi.org/10.1364/Ol.39.005166.
[7] Y. Xue, T. Cheng, X.H. Xu, Z.R. Gao, Q.Q. Li, X.J. Liu, X. Wang, R. Song, X.Y. Ju, Q.C. Zhang, High-accuracy and real-time 3d positioning, tracking system for medical imaging applications based on 3d digital image correlation, Opt. Lasers Eng. 88 (2017) 82–90, http://dx.doi.org/10.1016/j.optlaseng.2016.07.002.
[8] Y.A. Xue, Y. Su, C. Zhan, X.H. Xu, Z.R. Gao, S.Q. Wu, Q.C. Zhang, X.P. Wu, Full-field wrist pulse signal acquisition and analysis by 3d digital image correlation, Opt. Lasers Eng. 98 (2017) 76–82, http://dx.doi.org/10.1016/j.optlaseng.2017.05.018.
[9] M.T. Begonia, M. Dallas, B. Vizcarra, Y. Liu, M.L. Johnson, G. Thiagarajan, Non-contact strain measurement in the mouse forearm loading model using digital image correlation (dic), Bone 81 (2015) 593–601, http://dx.doi.org/10.1016/j.bone.2015.09.007.
[10] Z.R. Gao, F.J. Li, Y. Liu, T. Cheng, Y. Su, Z. Fang, M. Yang, Y. Li, J. Yu, Q.C. Zhang, Tunnel contour detection during construction based on digital image correlation, Opt. Lasers Eng. 126 (2020) 105879, http://dx.doi.org/10.1016/j.optlaseng.2019.105879.
[11] Y. Su, Q.C. Zhang, Z.R. Gao, X.H. Xu, X.P. Wu, Fourier-based interpolation bias prediction in digital image correlation, Opt. Express 23 (15) (2015) 19242–19260, http://dx.doi.org/10.1364/Oe.23.019242.
[12] Y. Su, Q.C. Zhang, X.H. Xu, Z.R. Gao, S.Q. Wu, Interpolation bias for the inverse compositional Gauss–Newton algorithm in digital image correlation, Opt. Lasers Eng. 100 (2018) 267–278, http://dx.doi.org/10.1016/j.optlaseng.2017.09.013.
[13] Z.R. Gao, X.H. Xu, Y. Su, Q.C. Zhang, Experimental analysis of image noise and interpolation bias in digital image correlation, Opt. Lasers Eng. 81 (2016) 46–53, http://dx.doi.org/10.1016/j.optlaseng.2016.01.002.
[14] Y. Su, Q.C. Zhang, Z.R. Gao, X.H. Xu, Noise-induced bias for convolution-based interpolation in digital image correlation, Opt. Express 24 (2) (2016) 1175–1195, http://dx.doi.org/10.1364/Oe.24.001175.
[15] Y. Su, Z.R. Gao, Z. Fang, Y. Liu, Y.R. Wang, Q.C. Zhang, S.Q. Wu, Theoretical analysis on performance of digital speckle pattern: uniqueness, accuracy, precision, and spatial resolution, Opt. Express 27 (16) (2019) 22439–22474, http://dx.doi.org/10.1364/Oe.27.022439.
[16] Y. Su, Q.C. Zhang, Z.R. Gao, Statistical model for speckle pattern optimization, Opt. Express 25 (24) (2017) 30259–30275, http://dx.doi.org/10.1364/Oe.25.030259.
[17] Y. Su, Q.C. Zhang, X.H. Xu, Z.R. Gao, Quality assessment of speckle patterns for dic by consideration of both systematic errors and random errors, Opt. Lasers Eng. 86 (2016) 132–142, http://dx.doi.org/10.1016/j.optlaseng.2016.05.019.
[18] Z.R. Gao, Q.C. Zhang, Y. Su, S.Q. Wu, Accuracy evaluation of optical distortion calibration by digital image correlation, Opt. Lasers Eng. 98 (2017) 143–152, http://dx.doi.org/10.1016/j.optlaseng.2017.06.008.
[19] B. Pan, L.P. Yu, D.F. Wu, L.Q. Tang, Systematic errors in two-dimensional digital image correlation due to lens distortion, Opt. Lasers Eng. 51 (2) (2013) 140–147, http://dx.doi.org/10.1016/j.optlaseng.2012.08.012.
[20] B. Pan, H.M. Xie, Z.Y. Wang, K.M. Qian, Z.Y. Wang, Study on subset size selection in digital image correlation for speckle patterns, Opt. Express 16 (10) (2008) 7037–7048, http://dx.doi.org/10.1364/Oe.16.007037.
[21] X.H. Xu, Y. Su, Y.L. Cai, T. Cheng, Q.C. Zhang, Effects of various shape functions and subset size in local deformation measurements using dic, Exp. Mech. 55 (8) (2015) 1575–1590, http://dx.doi.org/10.1007/s11340-015-0054-9.
[22] B. Pan, Digital image correlation for surface deformation measurement: historical developments, recent advances and future goals, Meas. Sci. Technol. 29 (8) (2018) 082001, http://dx.doi.org/10.1088/1361-6501/aac55b.
[23] X.X. Shao, X.J. Dai, Z.N. Chen, Y.T. Dai, S. Dong, X.Y. He, Calibration of stereo-digital image correlation for deformation measurement of large engineering components, Meas. Sci. Technol. 27 (12) (2016) 125010, http://dx.doi.org/10.1088/0957-0233/27/12/125010.
[24] Z. Wang, Z. Wu, X. Zhen, R. Yang, J. Xi, X. Chen, A two-step calibration method of a large fov binocular stereovision sensor for onsite measurement, Measurement 62 (2015) 15–24, http://dx.doi.org/10.1016/j.measurement.2014.10.037.
[25] V. Srivastava, J. Baqersad, An optical-based technique to obtain operating deflection shapes of structures with complex geometries, Mech. Syst. Signal Process. 128 (2019) 69–81.
[26] P. Poozesh, A. Sabato, A. Sarrafi, C. Niezrecki, P. Avitabile, R. Yarala, Multicamera measurement system to evaluate the dynamic response of utility-scale wind turbine blades, Wind Energy 23 (7) (2020) 1619–1639.
[27] D. Gorjup, J. Slavič, M. Boltežar, Frequency domain triangulation for full-field 3d operating-deflection-shape identification, Mech. Syst. Signal Process. 133 (2019) 106287.
[28] Z.Y. Zhang, A flexible new technique for camera calibration, IEEE Trans. Pattern Anal. Mach. Intell. 22 (11) (2000) 1330–1334, http://dx.doi.org/10.1109/34.888718.
[29] R.Y. Tsai, A versatile camera calibration technique for high-accuracy 3d machine vision metrology using off-the-shelf tv cameras and lenses, IEEE J. Robot. Autom. 3 (4) (1987) 323–344, http://dx.doi.org/10.1109/jra.1987.1087109.
[30] K. Genovese, Y.X. Chi, B. Pan, Stereo-camera calibration for large-scale dic measurements with active phase targets and planar mirrors, Opt. Express 27 (6) (2019) 9040–9053, http://dx.doi.org/10.1364/Oe.27.009040.
[31] Z.Z. Xiao, L. Jin, D.H. Yu, Z.Z. Tang, A cross-target-based accurate calibration method of binocular stereo systems with large-scale field-of-view, Measurement 43 (6) (2010) 747–754, http://dx.doi.org/10.1016/j.measurement.2010.01.017.
[32] Z.L. Su, L. Lu, S. Dong, F.J. Yang, X.Y. He, Auto-calibration and real-time external parameter correction for stereo digital image correlation, Opt. Lasers Eng. 121 (2019) 46–53, http://dx.doi.org/10.1016/j.optlaseng.2019.03.018.
[33] Z. Liu, F.J. Li, X.J. Li, G.J. Zhang, A novel and accurate calibration method for cameras with large field of view using combined small targets, Measurement 64 (2015) 1–16, http://dx.doi.org/10.1016/j.measurement.2014.11.027.
[34] A. Sabato, N.A. Valente, C. Niezrecki, Development of a camera localization system for three-dimensional digital image correlation camera triangulation, IEEE Sens. J. 20 (19) (2020) 11518–11526.
[35] W. Feng, Z. Su, Y. Han, H. Liu, Q. Yu, S. Liu, D. Zhang, Inertial measurement unit aided extrinsic parameters calibration for stereo vision systems, Opt. Lasers Eng. 134 (2020) 106252.
[36] A.A. Magill, Variation in distortion with magnification, J. Opt. Soc. Amer. 45 (3) (1955) 148–151, http://dx.doi.org/10.1364/Josa.45.000148.