Article
A Novel Point Set Registration-Based Hand–Eye Calibration
Method for Robot-Assisted Surgery
Wenyuan Sun † , Jihao Liu † , Yuyun Zhao and Guoyan Zheng *
Institute of Medical Robotics, School of Medical Engineering, Shanghai Jiao Tong University,
Shanghai 200240, China
* Correspondence: [email protected]
† These authors contributed equally to this work.
Abstract: Pedicle screw insertion with robot assistance dramatically improves surgical accuracy and safety when compared with manual implantation. In developing such a system, hand–eye calibration is an essential component that aims to determine the transformation between a position-tracking system and a robot-arm system. In this paper, we propose an effective hand–eye calibration method, namely registration-based hand–eye calibration (RHC), which estimates the calibration transformation via point set registration without the need to solve the AX = XB equation. Our hand–eye calibration method consists of tool-tip pivot calibrations in two coordinate systems, in addition to paired-point matching, where the point pairs are generated via the steady movement of the robot arm in space. After calibration, our system allows for robot-assisted, image-guided pedicle screw insertion. Comprehensive experiments are conducted to verify the efficacy of the proposed hand–eye calibration method. A mean distance deviation of 0.70 mm and a mean angular deviation of 0.68° are achieved by our system when the proposed hand–eye calibration method is used. Further experiments on drilling trajectories are conducted on plastic vertebrae as well as pig vertebrae. A mean distance deviation of 1.01 mm and a mean angular deviation of 1.11° are observed when the drilled trajectories are compared with the planned trajectories on the pig vertebrae.
Keywords: hand–eye calibration; robot-assisted surgery; pedicle screw insertion; paired-point matching

1. Introduction

Pedicle screw insertion is an effective treatment of spinal diseases, such as scoliosis, in addition to spinal fracture and vertebral injury. Manual implantation is challenging, especially in patients with severe spinal deformity, osteoporosis, or tumor [1–3]. To address the challenge, one of the proposed technologies is to integrate a robot arm with a computer navigation system [4–7]. In developing such a system, hand–eye calibration is an essential component, which aims to determine the homogeneous transformation between the robot hand/end-effector and the optical frame affixed to the end-effector [8,9].

Due to its importance, a number of approaches have been developed to solve the problem. Hand–eye calibration can be formulated in the form of AX = XB, where A and B are the robotic end-effector and the optical frame poses between successive time frames, respectively, and X is the unknown transformation matrix between the robot end-effector and the optical frame. Many solutions have been proposed to recover X given data streams {Ai} and {Bi}. Solutions to the problem can be roughly classified into four categories, i.e., separable solutions [10–14], simultaneous solutions [15–17], iterative solutions [8,18–21], and probabilistic methods [22,23]. Specifically, given the matrices A and B, it is possible to decompose the equation into rotational and translational parts. Separable solutions utilize this property to solve hand–eye calibration, where the rotation part is first solved, followed by solving the translational part. In contrast, simultaneous solutions solve the rotational and translational parts at the same time. Methods in the third category solve the problem iteratively.
2. Related Works
Considerable effort has been devoted to solving the problem of hand–eye calibration.
Due to the wide applications of robot-assisted procedures, different types of methods
have been developed for increased accuracy and robustness. Existing solutions can be
roughly classified into four categories, as shown in Table 1, i.e., separable closed-form
solutions [10–14], simultaneous closed-form solutions [15–17], iterative solutions [18–20],
and probabilistic methods [22,23].
The earliest approaches separately estimated the rotational and translational parts.
For example, Shiu et al. proposed a method for solving homogeneous transform equa-
tions [10]. Tsai presented an efficient 3D robotics hand–eye calibration algorithm that
computed the 3D position and orientation separately [11]. Quaternion-based [13], extrinsic
hand–eye calibration [12], and dual-quaternions-based calibration methods [14] have been
introduced for the individual estimations of rotational and translational parts. One known
problem with separable methods is that any error in the estimation of the rotation matrices
may be propagated to the estimation of the translation vector.
To avoid the error propagation problem with separable solutions, methods in the
second category simultaneously compute the orientation and position. For example, Lu et al.
proposed an approach that transformed the kinematic equation into linear systems using
normalized quaternions [16]. Andreff et al. proposed an on-line hand–eye calibration
method that derived a linear formulation of the problem [15]. Zhao et al. [17] proposed a
hand–eye calibration method based on screw motion theory to establish linear equations
and simultaneously solve rotation and translation. As confirmed by experimental results,
simultaneous methods have less error than separable solutions [25].
Iterative solutions are another type of method used to solve the problem of error
propagation. For example, Zhuang et al. [18] presented an iterative algorithm to solve the
unknown matrix X in one stage, thus eliminating error propagation and improving noise
sensitivity. Mao et al. [20] proposed using a direct linear closed-form solution followed
by Jacobian optimization to solve AX = XB for hand–eye calibration. Hirsh et al. [26]
proposed a robust iterative method to simultaneously estimate both the hand–eye and
robot–world spatial transformation. Based on a metric defined on the group of the rigid
transformation SE(3), Strobl and Hirzinger [27] presented an error model for nonlinear
optimization. They then proposed a calibration method for estimating both the hand–eye
and robot–world transformations. While iterative solutions are generally accurate, they can
be computationally expensive and may not always converge to the optimal solution [28].
The methods mentioned above assume an exact correspondence between the streams
of sensor data, while methods in the fourth category eliminate such a requirement. For ex-
ample, Ma et al. [23] proposed two probabilistic approaches by giving new definitions of
the mean on SE(3), which alleviated the restrictions on the dataset and led to improved
accuracy. Although it is worth investigating the situation when the exact correspondence
between sensor data is unknown, probabilistic methods usually lead to longer computation
times. Additionally, assuming an exact correspondence is not a problem in our study.
Hand–eye calibration is also an active research topic in medical applications. For ex-
ample, Morgan et al. [29] presented a Procrustean perspective-n-point (PnP) solution for
hand–eye calibration for surgical cameras, achieving an average projection error of 12.99
pixels when evaluated on a surgical laparoscope. Özgüner et al. [30] proposed a solution
for hand–eye calibration for the da Vinci robotic surgical system by breaking down the
calibration procedure into systematic steps to reduce error accumulation. They reported
a root mean square (RMS) error of 2.1 mm and a mean rotational error of 3.2° when their
calibration method was used to produce visually-guided end-effector motions. Using
the da Vinci Research Kit (dVRK) and an RGB-D camera, Roberti et al. [31] proposed to
separate the calibration of the robotic arms and an endoscope camera manipulator from the
hand–eye calibration of the camera for an improved accuracy in a 3D metric space. The pro-
posed method reached a sub-millimeter accuracy in a dual-arm manipulation scenario,
while the use of the RGB-D camera limited its actual application in surgery. Sun et al. [32]
proposed a hand–eye calibration method for robot-assisted minimally invasive surgery,
which relied purely on surgical instruments already in the operating scenario. Their model
was formed by the geometry information of the surgical instrument and the remote center-
of-motion (RCM) constraint, outperforming traditional hand–eye calibration methods in
both simulation and robot experiments.
Deep learning-based methods, especially those based on convolutional neural net-
works (CNN), have also been developed for low-level image-processing tasks in hand–eye
calibration [33–36]. For example, Valassakis et al. [34] proposed a sparse correspondence
model that used a U-Net to detect 2D key points for eye-in-hand camera calibration.
Kim et al. [36] introduced deep learning-based methods to restore out-of-focus blurred
images for an improved accuracy in hand–eye calibration.
Figure 1. The involved coordinate systems for robot-assisted, image-guided pedicle screw insertion.
During a pedicle screw insertion procedure, the pose of the guide is adjusted to align with a trajectory,
which is planned in a pre-operative CT first, and then is transformed to the patient space via a
surface registration.
$$
\begin{bmatrix}
({}^{C}_{M}R)_1 & -I \\
\vdots & \vdots \\
({}^{C}_{M}R)_n & -I
\end{bmatrix}
\begin{bmatrix}
p_M \\ p_C
\end{bmatrix}
=
\begin{bmatrix}
-({}^{C}_{M}t)_1 \\
\vdots \\
-({}^{C}_{M}t)_n
\end{bmatrix}
\tag{1}
$$

where $I$ is the $3 \times 3$ identity matrix.

Defining $\tilde{R} = \begin{bmatrix} ({}^{C}_{M}R)_1 & -I \\ \vdots & \vdots \\ ({}^{C}_{M}R)_n & -I \end{bmatrix}$ and $\tilde{t} = \begin{bmatrix} -({}^{C}_{M}t)_1 \\ \vdots \\ -({}^{C}_{M}t)_n \end{bmatrix}$, we have:

$$
\tilde{R} \begin{bmatrix} p_M \\ p_C \end{bmatrix} = \tilde{t}
\tag{2}
$$

Then, we can solve $p_M$ and $p_C$ using the pseudo-inverse [39]:

$$
\begin{bmatrix} p_M \\ p_C \end{bmatrix} = (\tilde{R}^{T}\tilde{R})^{-1}\tilde{R}^{T}\tilde{t}
\tag{3}
$$
As we are only interested in knowing the offset of the tool tip with respect to $O_M$, we keep $p_M$ while disregarding $p_C$.
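To make this step concrete, the following minimal NumPy sketch (an illustrative implementation under the notation above, not the verbatim system code; function and variable names are our own) stacks the tracked rotations and translations into the overdetermined system of Equation (1) and solves it in the least-squares sense of Equation (3):

```python
import numpy as np

def pivot_calibration(rotations, translations):
    """Tool-tip pivot calibration, Eqs. (1)-(3): given n tracked poses
    ((C_M R)_i, (C_M t)_i) recorded while the tool tip rests on a fixed
    pivot, recover the tip offset p_M (marker COS) and the pivot point
    p_C (camera COS)."""
    n = len(rotations)
    R_tilde = np.zeros((3 * n, 6))
    t_tilde = np.zeros(3 * n)
    for i, (R, t) in enumerate(zip(rotations, translations)):
        R_tilde[3 * i:3 * i + 3, :3] = R           # (C_M R)_i
        R_tilde[3 * i:3 * i + 3, 3:] = -np.eye(3)  # -I
        t_tilde[3 * i:3 * i + 3] = -t              # -(C_M t)_i
    # Least-squares solution, equivalent to the pseudo-inverse of Eq. (3)
    x, *_ = np.linalg.lstsq(R_tilde, t_tilde, rcond=None)
    return x[:3], x[3:]                            # p_M, p_C
```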
Similarly, we can use the same pivot calibration technique to estimate the coordinates of the tool tip in both the robotic flange COS $O_F$ and the robot base COS $O_B$. This time, we pivoted the tool tip around a stationary point, as shown in Figure 2c. Similarly, we placed the tool tip in a divot to avoid sliding. We denoted, respectively, the two coordinates as $p_B$ and $p_F$. During pivoting, we kept $p_B$ and $p_F$ static while collecting a set of $l$ homogeneous transformations $\{({}^{B}_{F}T)_i = (({}^{B}_{F}R)_i, ({}^{B}_{F}t)_i);\ 1 \le i \le l\}$ via the robot arm API. Then, we estimated $p_B$ and $p_F$ by solving the following overdetermined equations [39]:
$$
\begin{bmatrix}
({}^{B}_{F}R)_1 & -I \\
\vdots & \vdots \\
({}^{B}_{F}R)_l & -I
\end{bmatrix}
\begin{bmatrix}
p_F \\ p_B
\end{bmatrix}
=
\begin{bmatrix}
-({}^{B}_{F}t)_1 \\
\vdots \\
-({}^{B}_{F}t)_l
\end{bmatrix}
\tag{4}
$$

where $I$ is the $3 \times 3$ identity matrix.
Again, we are only interested in knowing the offset of the tool tip with respect to $O_F$; therefore, we kept $p_F$ while disregarding $p_B$.
Figure 2. Tool-tip pivot calibration. (a) The calibration tool with a sharp tip is rigidly fixed to the flange during the hand–eye calibration; (b) pivot calibration of the offset of the tool tip with respect to the 3D COS $O_M$ of the optical reference frame on the end-effector; (c) pivot calibration of the offset of the tool tip with respect to the robotic flange COS $O_F$.
which are the coordinates of the tool tip measured in the robot base COS $O_B$ via $(p_B)_i = ({}^{B}_{F}T)_i\, p_F$. Therefore, we can solve the spatial transformation ${}^{C}_{B}T$ using a paired-point matching algorithm.
Figure 3. Solving ${}^{C}_{B}T$ via paired-point matching. By controlling the flange to move to m different positions, we can obtain the coordinates of the tool tip in both the optical tracking camera COS $O_C$ and the robot base COS $O_B$, generating two point sets. ${}^{C}_{B}T$ is solved by matching the two point sets using a paired-point matching algorithm.
For the first step to match two paired-point sets, we computed a $3 \times 3$ matrix $H$ as follows:

$$
H = \sum_{j=1}^{m} \Big( (p_B)_j - \frac{1}{m}\sum_{i=1}^{m}(p_B)_i \Big) \cdot \Big( (p_C)_j - \frac{1}{m}\sum_{i=1}^{m}(p_C)_i \Big)^{T}
\tag{5}
$$

We then used the singular value decomposition (SVD) [39] to decompose matrix $H$ into $U$, $S$, and $V$ matrices:

$$
H = U S V^{T}
\tag{6}
$$
Based on the decomposed matrices, we computed the rotation matrix ${}^{C}_{B}R$ as:

$$
{}^{C}_{B}R = V \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & \lambda \end{bmatrix} U^{T}
\tag{7}
$$

where $\lambda = \det(UV)$.

Based on ${}^{C}_{B}R$, we solved ${}^{C}_{B}t$ using:

$$
{}^{C}_{B}t = \frac{1}{m}\sum_{i=1}^{m}(p_C)_i - {}^{C}_{B}R\, \frac{1}{m}\sum_{i=1}^{m}(p_B)_i
\tag{8}
$$

$$
{}^{C}_{B}T = ({}^{C}_{B}R,\ {}^{C}_{B}t)
\tag{9}
$$
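A minimal NumPy sketch of the paired-point matching step of Equations (5)–(9) could look as follows (illustrative only; the function and array names are our own and the production code may differ):

```python
import numpy as np

def paired_point_matching(p_B, p_C):
    """Eqs. (5)-(9): estimate (C_B R, C_B t) mapping robot-base points to
    camera points. p_B and p_C are (m, 3) arrays of corresponding tool-tip
    positions collected at the m robot positions."""
    c_B = p_B.mean(axis=0)                       # centroid in the robot base COS
    c_C = p_C.mean(axis=0)                       # centroid in the camera COS
    H = (p_B - c_B).T @ (p_C - c_C)              # 3x3 matrix of Eq. (5)
    U, S, Vt = np.linalg.svd(H)                  # H = U S V^T, Eq. (6)
    lam = np.linalg.det(U) * np.linalg.det(Vt)   # lambda = det(UV)
    R = Vt.T @ np.diag([1.0, 1.0, lam]) @ U.T    # Eq. (7)
    t = c_C - R @ c_B                            # Eq. (8)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t                   # C_B T of Eq. (9)
    return T
```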
For each position in the movement trajectory, we computed the spatial transformation $({}^{F}_{M}T)_i$ as:

$$
({}^{F}_{M}T)_i = ({}^{B}_{F}T)_i^{-1} \cdot ({}^{C}_{B}T)^{-1} \cdot ({}^{C}_{M}T)_i
\tag{10}
$$
where $({}^{B}_{F}T)_i$ and $({}^{C}_{M}T)_i$ are retrieved from the associated device's API when generating $P_C$ and $P_B$.
Each position will give a different $({}^{F}_{M}T)_i$. To improve the robustness and to increase the accuracy, we averaged all the obtained transformations. Specifically, we used $(\psi_i, \theta_i, \phi_i)$ to represent the Euler angles of $({}^{F}_{M}R)_i$, so the average rotation matrix ${}^{F}_{M}R$ can be written as:

$$
{}^{F}_{M}R = R\Big( \frac{1}{m}\sum_{i=1}^{m}\psi_i,\ \frac{1}{m}\sum_{i=1}^{m}\theta_i,\ \frac{1}{m}\sum_{i=1}^{m}\phi_i \Big)
\tag{11}
$$

where $R()$ represents the transformation from the Euler angles to the rotation matrix. Meanwhile, the average translation vector ${}^{F}_{M}t$ can be written as:

$$
{}^{F}_{M}t = \frac{1}{m}\sum_{i=1}^{m}({}^{F}_{M}t)_i
\tag{12}
$$

$$
{}^{F}_{M}T = ({}^{F}_{M}R,\ {}^{F}_{M}t)
\tag{13}
$$
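The per-position estimation and averaging of Equations (10)–(13) can be sketched as below, assuming SciPy's rotation utilities for the matrix/Euler conversions; the specific Euler convention ("zyx") is our illustrative choice, not prescribed by the method:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def average_hand_eye(T_BF_list, T_CM_list, T_CB):
    """Eqs. (10)-(13): compute (F_M T)_i at every robot position and average
    the results. All inputs are 4x4 homogeneous matrices; T_CB is the C_B T
    obtained from paired-point matching."""
    eulers, translations = [], []
    for T_BF, T_CM in zip(T_BF_list, T_CM_list):
        T_FM = np.linalg.inv(T_BF) @ np.linalg.inv(T_CB) @ T_CM             # Eq. (10)
        eulers.append(Rotation.from_matrix(T_FM[:3, :3]).as_euler("zyx"))
        translations.append(T_FM[:3, 3])
    R_avg = Rotation.from_euler("zyx", np.mean(eulers, axis=0)).as_matrix()  # Eq. (11)
    t_avg = np.mean(translations, axis=0)                                    # Eq. (12)
    T_FM_avg = np.eye(4)
    T_FM_avg[:3, :3], T_FM_avg[:3, 3] = R_avg, t_avg                         # Eq. (13)
    return T_FM_avg
```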
Figure 4. A schematic view of the guiding tube calibration. (a) The plug, which can be inserted into the guiding tube from both ends for digitization. (b) The three points on the tube that are digitized and the COS $O_T$ of the guiding tube established using the three points.
To establish the COS $O_T$, we defined the origin by $p^{(2)}$, the z-axis by $p^{(1)}$ and $p^{(2)}$, and determined the x–z plane by the three points. We obtained the transformation ${}^{M}_{T}T$ from its origin and axes, as:

$$
{}^{M}_{T}T =
\begin{bmatrix}
\dfrac{r_M^{(x)}}{\| r_M^{(x)} \|} & \dfrac{r_M^{(y)}}{\| r_M^{(y)} \|} & \dfrac{r_M^{(z)}}{\| r_M^{(z)} \|} & p_M^{(2)} \\
0 & 0 & 0 & 1
\end{bmatrix}
\tag{14}
$$

where,

$$
\begin{aligned}
r_M^{(x)} &= \big( (p_M^{(3)} - p_M^{(2)}) \times (p_M^{(1)} - p_M^{(2)}) \big) \times (p_M^{(1)} - p_M^{(2)}) \\
r_M^{(y)} &= (p_M^{(3)} - p_M^{(2)}) \times (p_M^{(1)} - p_M^{(2)}) \\
r_M^{(z)} &= p_M^{(1)} - p_M^{(2)}
\end{aligned}
\tag{15}
$$
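A small sketch of the frame construction in Equations (14) and (15), assuming the three digitized points are given as NumPy vectors expressed in $O_M$ (illustrative only, not the exact implementation):

```python
import numpy as np

def guiding_tube_frame(p1, p2, p3):
    """Eqs. (14)-(15): build M_T T, the guiding tube COS O_T expressed in the
    marker COS O_M, from the three digitized points p^(1), p^(2), p^(3)."""
    r_z = p1 - p2                           # z-axis along the tube axis
    r_y = np.cross(p3 - p2, p1 - p2)        # normal of the plane of the three points
    r_x = np.cross(r_y, r_z)                # completes the right-handed x-z plane
    T = np.eye(4)
    T[:3, 0] = r_x / np.linalg.norm(r_x)
    T[:3, 1] = r_y / np.linalg.norm(r_y)
    T[:3, 2] = r_z / np.linalg.norm(r_z)
    T[:3, 3] = p2                           # origin at p^(2)
    return T
```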
Figure 5. A schematic view of robot-assisted pedicle screw insertion: The target trajectory is planned in the COS $O_{CT}$ and transformed to the COS $O_R$ by intra-operative registration. The target trajectory is further transformed to the robot base COS $O_B$. The guiding tube is aligned with the target trajectory for insertion guidance.
the surface, we adopted a surface registration algorithm [37] to solve ${}^{R}_{CT}T$. Based on ${}^{R}_{CT}T$, $p_{CT}^{(e)}$ and $p_{CT}^{(t)}$ can be transformed to the COS $O_R$.
3.4.3. Transforming the Planned Trajectory to the Robot Base COS and Aligning the
Guiding Tube with the Transformed Trajectory
In the third step, we transformed the planned trajectory to the robot base COS $O_B$ so that the robot can align the guiding tube with the transformed trajectory, which is calculated as:

$$
\begin{bmatrix} p_B^{(e)} & p_B^{(t)} \end{bmatrix} = {}^{B}_{F}T \cdot {}^{F}_{M}T \cdot ({}^{C}_{M}T)^{-1} \cdot {}^{C}_{R}T \cdot {}^{R}_{CT}T \begin{bmatrix} p_{CT}^{(e)} & p_{CT}^{(t)} \end{bmatrix}
\tag{16}
$$

In Equation (16), we retrieved ${}^{C}_{R}T$ and ${}^{C}_{M}T$ from the optical tracking camera's API. ${}^{F}_{M}T$ is the hand–eye transformation. We retrieved ${}^{B}_{F}T$ from the robot arm's API.
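The transformation chain of Equation (16) amounts to a single matrix product applied to the homogeneous entry and target points, as in the following illustrative sketch (all names are our own; the T_* arguments are assumed to be 4 × 4 homogeneous transforms):

```python
import numpy as np

def transform_trajectory(p_CT_e, p_CT_t, T_BF, T_FM, T_CM, T_CR, T_RCT):
    """Eq. (16): map the planned entry/target points from the CT COS O_CT to
    the robot base COS O_B. The matrices correspond to B_F T, F_M T, C_M T,
    C_R T, and R_CT T, respectively."""
    chain = T_BF @ T_FM @ np.linalg.inv(T_CM) @ T_CR @ T_RCT
    p_B_e = (chain @ np.append(p_CT_e, 1.0))[:3]   # entry point in O_B
    p_B_t = (chain @ np.append(p_CT_t, 1.0))[:3]   # target point in O_B
    return p_B_e, p_B_t
```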
4.1. Metrics

In the experiments, the performance is quantified by the deviations between the actual path and the planned trajectory. The deviations consist of the incline angle (unit: °) and the distance deviation (unit: mm). We used the entry point $p^{(e)}$ and the target point $p^{(t)}$ on the planned trajectory to measure the distance, as shown in Figure 6. The distance deviation and incline angle between the guidance path and the planned trajectory are denoted as $d$ and $\phi$, respectively, while the distance deviation and the incline angle between the drilled path and the planned trajectory are denoted as $d'$ and $\phi'$, respectively.
Figure 6. Metrics used to evaluate the accuracy in this study, including distance deviations $d$ and $d'$, as well as incline angles $\phi$ and $\phi'$.
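As an illustrative sketch of these metrics (our own reading of the setup: the incline angle is taken between the two trajectory directions, and the distance deviation is computed here as the perpendicular distance from the planned entry point to the achieved path, which is one plausible definition rather than the exact one depicted in Figure 6):

```python
import numpy as np

def trajectory_deviation(entry_planned, target_planned, entry_actual, target_actual):
    """Incline angle (degrees) and distance deviation (mm) between a planned
    trajectory and an achieved (guidance or drilled) path, each given by an
    entry and a target point."""
    v_plan = target_planned - entry_planned
    v_act = target_actual - entry_actual
    cos_phi = np.dot(v_plan, v_act) / (np.linalg.norm(v_plan) * np.linalg.norm(v_act))
    phi = np.degrees(np.arccos(np.clip(cos_phi, -1.0, 1.0)))
    # One plausible distance measure: perpendicular distance from the planned
    # entry point to the line of the achieved path.
    u = v_act / np.linalg.norm(v_act)
    w = entry_planned - entry_actual
    d = np.linalg.norm(w - np.dot(w, u) * u)
    return d, phi
```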
4.2. Investigation of the Influence of the Range of Robot Movement on the Hand–Eye Calibration

In this experiment, we investigated the influence of the spatial range of robot movement on the proposed RHC. In the experiment, a plastic phantom was designed and used,
as shown in Figure 7a. The phantom, which was fabricated by 3D printing, had a dimension of 140 × 90 × 85 mm³, and 25 trajectories were planned on the phantom.

During the hand–eye calibration, the robot was controlled to move in an L × L × L mm³ cubic space. To investigate the influence of the range of robot movement, we calibrated different hand–eye transformation matrices with an L of 30, 60, 90, 120, 150, or 200 mm. Each time, after obtaining the hand–eye calibration, we used the obtained transformation to control the robot to align the guiding tube with a planned trajectory. After that, we digitized the guidance path to evaluate the alignment accuracy.
Experimental results are shown in Figure 7 and Table 2. Both d and φ decreased when L increased. When L was 200 mm, the mean distance deviation was 0.70 mm and the mean incline angle was 0.68°. The results demonstrate that the larger the robot movement range, the higher the hand–eye calibration accuracy. However, further increasing the movement range will lead to a failure in tracking by the camera. We found that the maximally allowed robot movement range is 200 × 200 × 200 mm³.
Figure 7. Investigation of the influence of the range of robot movement on the hand–eye calibration: (a) the plastic phantom used in the experiment; (b) the box plots of distance deviation and incline angle.
Table 2. Investigation of the influence of the range of robot movement on the hand–eye calibration.

L [mm]    d Mean [mm]    d Max. [mm]    φ Mean [°]    φ Max. [°]
30        1.17           1.40           0.87          1.25
60        0.86           1.09           0.83          0.93
90        0.82           0.95           0.72          0.91
120       0.86           1.06           0.75          0.90
150       0.71           1.11           0.70          0.85
200       0.70           0.88           0.68          0.96
Figure 8. Comparison of our method with the SOTA hand–eye calibration methods.
Table 3. Comparison of our method with the SOTA hand–eye calibration methods.
Figure 9. Overall system accuracy study: (a) the 3D-printed vertebra and pig vertebrae; (b) the CT
image of the animal vertebrae after drilling; (c) the box plots of distance deviation and incline angle.
                     Plastic Phantom    3D-Printed Vertebrae    Pig Vertebrae
d [mm]      Mean        0.70                0.66                   0.71
            Max.        0.85                0.79                   0.82
d′ [mm]     Mean        0.93                0.90                   1.01
            Max.        1.15                1.13                   1.52
φ [°]       Mean        0.72                0.79                   0.82
            Max.        0.94                0.91                   0.96
φ′ [°]      Mean        1.04                0.96                   1.11
            Max.        1.45                1.24                   1.38
5. Discussions
Hand–eye calibration is one of the essential components when developing a robot-
assisted, image-guided pedicle screw insertion system. The accuracy of hand–eye cali-
bration will have a direct influence on the system accuracy. However, it is challenging
to develop an accurate and robust method for hand–eye calibration. In this paper, we
proposed an effective hand–eye calibration method based on tool-tip pivot calibration and
paired-point matching without the need to solve the AX = XB equation. Comprehensive
experiments were conducted to validate the accuracy of our proposed hand–eye calibration
method as well as the robot-assisted, image-guided pedicle screw insertion system. Both
qualitative and quantitative results demonstrate the efficacy of our hand–eye calibration
method and the high accuracy of our system.
In comparison with SOTA hand–eye calibration methods, our method has the following advantages: First, our method is a simultaneous closed-form solution, which is derived by solving three overdetermined equations, guaranteeing an optimal solution. Second, unlike other simultaneous solutions, we reformulate the hand–eye calibration problem as solutions to tool-tip pivot calibrations in two coordinate systems and paired-point matching, taking advantage of the steady movement of the robot arm, thus reducing measurement errors and noise. Third, in comparison with methods depending on iterative
solutions [18–21] or probabilistic models [22,23], our method is much faster because it is
not an iterative solution and only requires simple matrix operations.
Based on the novel hand–eye calibration method, we further developed a robot-
assisted, image-guided pedicle screw insertion system. We conducted trajectory drilling
experiments on a plastic phantom, 3D-printed vertebrae, and pig vertebrae to validate
the accuracy of our system. When drilling trajectories on the plastic phantom, our system
achieved a mean distance deviation of 0.93 mm and a mean angular deviation of 1.04°. When it was used to drill trajectories on the 3D-printed vertebrae, our system achieved a mean distance deviation of 0.90 mm and a mean angular deviation of 0.96°. To check whether the differences between results obtained from the plastic phantom and the 3D-printed vertebrae are statistically significant, we conducted an unpaired t-test and chose a significance level of α = 0.05. We found a p-value of 0.52 for the distance deviation and a p-value of 0.40 for the angular deviation. When drilling trajectories on the pig vertebrae, our system achieved a mean distance deviation of 1.01 mm and a mean angular deviation of 1.11°, which is regarded as accurate enough for pedicle screw insertion.
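The unpaired t-test mentioned above can be reproduced along the following lines (a hypothetical sketch using SciPy; the two arrays would hold the per-trajectory deviations of the groups being compared):

```python
import numpy as np
from scipy import stats

def compare_deviations(dev_a, dev_b, alpha=0.05):
    """Two-sided unpaired t-test between two groups of deviations, e.g., the
    per-trajectory distance deviations on the plastic phantom vs. the
    3D-printed vertebrae."""
    _, p_value = stats.ttest_ind(np.asarray(dev_a), np.asarray(dev_b))
    return p_value, bool(p_value < alpha)   # significant if p < alpha
```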
6. Conclusions
In this paper, we proposed a novel hand–eye calibration method, namely registration-
based hand–eye calibration (RHC), to estimate the calibration transformation via paired-
point matching without the need to solve the AX = XB equation. Based on the proposed
hand–eye calibration method, we developed a robot-assisted, image-guided pedicle screw
insertion system. Comprehensive experiments were conducted to investigate the influence
of the range of robot movement on the hand–eye calibration, to compare our method with
state-of-the-art methods, and to evaluate the overall system accuracy. Our experimental results
demonstrate the efficacy of our hand–eye calibration method and the high accuracy of
our system. Our novel hand–eye calibration method can be applied to other types of
robot-assisted surgery.
Author Contributions: Conceptualization, G.Z.; Data curation, W.S., J.L. and Y.Z.; Formal analysis,
W.S.; Funding acquisition, G.Z.; Investigation, W.S., J.L. and Y.Z.; Methodology, W.S. and G.Z.; Project
administration, G.Z.; Software, W.S.; Supervision, G.Z.; Validation, J.L. and Y.Z.; Visualization, W.S.;
Writing—original draft, W.S.; Writing—review & editing, G.Z. All authors have read and agreed to
the published version of the manuscript.
Funding: This research was partially supported by the Shanghai Municipal Science and Technology
Commission (20511105205) and by the National Natural Science Foundation of China (U20A20199).
Institutional Review Board Statement: The study was conducted in accordance with the Declaration
of Helsinki, and approved by the Institutional Review Board of School of Biomedical Engineering,
Shanghai Jiao Tong University, China (Approval No. 2020031, approved on 8 May 2020).
Informed Consent Statement: Not applicable.
Data Availability Statement: Not applicable.
Conflicts of Interest: The authors declare no conflict of interest.
Abbreviations
The following abbreviations are used in this manuscript:
CT Computed Tomography
API Application Programming Interface
COS Coordinate System
SVD Singular Value Decomposition
RHC Registration-based Hand–eye Calibration
3D Three-dimensional
References
1. Tian, N.F.; Xu, H.Z. Image-guided pedicle screw insertion accuracy: A meta-analysis. Int. Orthop. 2009, 33, 895–903. [CrossRef]
[PubMed]
2. Fan, Y.; Du, J.; Zhang, J.; Liu, S.; Xue, X.; Huang, Y.; Zhang, J.; Hao, D. Comparison of accuracy of pedicle screw insertion among
4 guided technologies in spine surgery. Med. Sci. Monit. Int. Med. J. Exp. Clin. Res. 2017, 23, 5960. [CrossRef] [PubMed]
3. Nguyen, N.Q.; Priola, S.M.; Ramjist, J.M.; Guha, D.; Dobashi, Y.; Lee, K.; Lu, M.; Androutsos, D.; Yang, V. Machine vision
augmented reality for pedicle screw insertion during spine surgery. J. Clin. Neurosci. 2020, 72, 350–356. [CrossRef] [PubMed]
4. Solomiichuk, V.; Fleischhammer, J.; Molliqaj, G.; Warda, J.; Alaid, A.; von Eckardstein, K.; Schaller, K.; Tessitore, E.; Rohde,
V.; Schatlo, B. Robotic versus fluoroscopy-guided pedicle screw insertion for metastatic spinal disease: A matched-cohort
comparison. Neurosurg. Focus 2017, 42, E13. [CrossRef] [PubMed]
5. Molliqaj, G.; Schatlo, B.; Alaid, A.; Solomiichuk, V.; Rohde, V.; Schaller, K.; Tessitore, E. Accuracy of robot-guided versus
freehand fluoroscopy-assisted pedicle screw insertion in thoracolumbar spinal surgery. Neurosurg. Focus 2017, 42, E14. [CrossRef]
[PubMed]
6. Kim, H.J.; Jung, W.I.; Chang, B.S.; Lee, C.K.; Kang, K.T.; Yeom, J.S. A prospective, randomized, controlled trial of robot-assisted vs
freehand pedicle screw fixation in spine surgery. Int. J. Med. Robot. Comput. Assist. Surg. 2017, 13, e1779. [CrossRef]
7. Shaw, K.A.; Murphy, J.S.; Devito, D.P. Accuracy of robot-assisted pedicle screw insertion in adolescent idiopathic scoliosis: Is
triggered electromyographic pedicle screw stimulation necessary? J. Spine Surg. 2018, 4, 187. [CrossRef]
8. Wu, L.; Ren, H. Finding the kinematic base frame of a robot by hand-eye calibration using 3D position data. IEEE Trans. Autom.
Sci. Eng. 2016, 14, 314–324. [CrossRef]
9. Liu, G.; Yu, X.; Li, C.; Li, G.; Zhang, X.; Li, L. Space calibration of the cranial and maxillofacial robotic system in surgery. Comput.
Assist. Surg. 2016, 21, 54–60. [CrossRef]
10. Shiu, Y.C.; Ahmad, S. Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form
AX= XB. IEEE Trans. Robot. Autom. 1989, 5, 16–29. [CrossRef]
11. Tsai, R.Y.; Lenz, R.K. A new technique for fully autonomous and efficient 3D robotics hand/eye calibration. IEEE Trans. Robot. Autom. 1989, 5, 345–358. [CrossRef]
12. Wang, C.C. Extrinsic calibration of a vision sensor mounted on a robot. IEEE Trans. Robot. Autom. 1992, 8, 161–175. [CrossRef]
13. Chou, J.C.; Kamel, M. Finding the position and orientation of a sensor on a robot manipulator using quaternions. Int. J. Robot.
Res. 1991, 10, 240–254. [CrossRef]
14. Daniilidis, K. Hand-eye calibration using dual quaternions. Int. J. Robot. Res. 1999, 18, 286–298. [CrossRef]
15. Andreff, N.; Horaud, R.; Espiau, B. On-line hand-eye calibration. In Proceedings of the Second International Conference on 3-D
Digital Imaging and Modeling (Cat. No. PR00062), Ottawa, ON, Canada, 8 October 1999; pp. 430–436.
16. Lu, Y.C.; Chou, J.C. Eight-space quaternion approach for robotic hand-eye calibration. In Proceedings of the 1995 IEEE
International Conference on Systems, Man and Cybernetics. Intelligent Systems for the 21st Century, Vancouver, BC, Canada,
22–25 October 1995; Volume 4, pp. 3316–3321.
17. Zhao, Z.; Liu, Y. Hand-eye calibration based on screw motions. In Proceedings of the 18th International Conference on Pattern
Recognition (ICPR’06), Hong Kong, China, 20–24 August 2006; Volume 3, pp. 1022–1026.
18. Zhuang, H.; Shiu, Y.C. A noise-tolerant algorithm for robotic hand-eye calibration with or without sensor orientation measurement.
IEEE Trans. Syst. Man, Cybern. 1993, 23, 1168–1175. [CrossRef]
19. Wei, G.Q.; Arbter, K.; Hirzinger, G. Active self-calibration of robotic eyes and hand-eye relationships with model identification.
IEEE Trans. Robot. Autom. 1998, 14, 158–166.
20. Mao, J.; Huang, X.; Jiang, L. A flexible solution to AX= XB for robot hand-eye calibration. In Proceedings of the 10th WSEAS
International Conference on Robotics, Control and Manufacturing Technology, Hangzhou, China, 11–13 April 2010; pp. 118–122.
21. Zhang, Z.; Zhang, L.; Yang, G.Z. A computationally efficient method for hand–eye calibration. Int. J. Comput. Assist. Radiol. Surg.
2017, 12, 1775–1787. [CrossRef]
22. Li, H.; Ma, Q.; Wang, T.; Chirikjian, G.S. Simultaneous hand-eye and robot-world calibration by solving the AX = YB problem
without correspondence. IEEE Robot. Autom. Lett. 2015, 1, 145–152. [CrossRef]
23. Ma, Q.; Li, H.; Chirikjian, G.S. New probabilistic approaches to the AX= XB hand-eye calibration without correspondence. In
Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May
2016; pp. 4365–4371.
24. Aiguo, L.; Lin, W.; Defeng, W. Simultaneous robot-world and hand-eye calibration using dual-quaternions and Kronecker
product. Int. J. Phys. Sci. 2010, 5, 1530–1536.
25. Ali, I.; Suominen, O.; Gotchev, A.; Morales, E.R. Methods for simultaneous robot-world-hand–eye calibration: A comparative
study. Sensors 2019, 19, 2837. [CrossRef]
26. Hirsh, R.L.; DeSouza, G.N.; Kak, A.C. An iterative approach to the hand-eye and base-world calibration problem. In Proceedings
of the 2001 ICRA, IEEE International Conference on Robotics and Automation (Cat. No. 01CH37164), Seoul, Korea, 21–26 May
2001; Volume 3, pp. 2171–2176.
27. Strobl, K.H.; Hirzinger, G. Optimal hand-eye calibration. In Proceedings of the 2006 IEEE/RSJ International Conference on
Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 4647–4653.
28. Shah, M.; Eastman, R.D.; Hong, T. An overview of robot-sensor calibration methods for evaluation of perception systems. In
Proceedings of the Workshop on Performance Metrics for Intelligent Systems, Gaithersburg, MD, USA, 19–21 August 2012;
pp. 15–20.
29. Morgan, I.; Jayarathne, U.; Rankin, A.; Peters, T.M.; Chen, E. Hand-eye calibration for surgical cameras: A procrustean
perspective-n-point solution. Int. J. Comput. Assist. Radiol. Surg. 2017, 12, 1141–1149. [CrossRef] [PubMed]
30. Özgüner, O.; Shkurti, T.; Huang, S.; Hao, R.; Jackson, R.C.; Newman, W.S.; Çavuşoğlu, M.C. Camera-robot calibration for the da Vinci robotic surgery system. IEEE Trans. Autom. Sci. Eng. 2020, 17, 2154–2161. [CrossRef] [PubMed]
31. Roberti, A.; Piccinelli, N.; Meli, D.; Muradore, R.; Fiorini, P. Improving rigid 3-d calibration for robotic surgery. IEEE Trans. Med.
Robot. Bionics 2020, 2, 569–573. [CrossRef]
32. Sun, Y.; Pan, B.; Guo, Y.; Fu, Y.; Niu, G. Vision-based hand–eye calibration for robot-assisted minimally invasive surgery. Int. J.
Comput. Assist. Radiol. Surg. 2020, 15, 2061–2069. [CrossRef]
33. Tang, Y.; Zhou, H.; Wang, H.; Zhang, Y. Fruit detection and positioning technology for a Camellia oleifera C. Abel orchard based
on improved YOLOv4-tiny model and binocular stereo vision. Expert Syst. Appl. 2023, 211, 118573. [CrossRef]
34. Valassakis, E.; Dreczkowski, K.; Johns, E. Learning Eye-in-Hand Camera Calibration from a Single Image. In Proceedings of the
Conference on Robot Learning, PMLR, London, UK, 8–11 November 2021; pp. 1336–1346.
35. Huo, J.; Meng, Z.; Zhang, H.; Chen, S.; Yang, F. Feature points extraction of defocused images using deep learning for camera
calibration. Measurement 2022, 188, 110563. [CrossRef]
36. Kim, H.S.; Kuc, T.Y.; Lee, K.H. Hand-eye calibration using images restored by deep learning. In Proceedings of the 2020 IEEE
International Conference on Consumer Electronics-Asia (ICCE-Asia), Seoul, Korea, 1–3 November 2020; pp. 1–4.
37. Low, K.L. Linear least-squares optimization for point-to-plane icp surface registration. Chapel Hill Univ. North Carol. 2004, 4, 1–3.
38. Khamene, A.; Sauer, F. A novel phantom-less spatial and temporal ultrasound calibration method. In Proceedings of the
International Conference on Medical Image Computing and Computer-Assisted Intervention, Palm Springs, CA, USA, 26–29
October 2005; pp. 65–72.
39. Petersen, P. Linear Algebra; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012.
40. Shah, M. Solving the robot-world/hand-eye calibration problem using the Kronecker product. J. Mech. Robot. 2013, 5, 031007.
[CrossRef]