
Article
A Novel Point Set Registration-Based Hand–Eye Calibration
Method for Robot-Assisted Surgery
Wenyuan Sun † , Jihao Liu † , Yuyun Zhao and Guoyan Zheng *

Institute of Medical Robotics, School of Medical Engineering, Shanghai Jiao Tong University,
Shanghai 200240, China
* Correspondence: [email protected]
† These authors contributed equally to this work.

Abstract: Pedicle screw insertion with robot assistance dramatically improves surgical accuracy and safety when compared with manual implantation. In developing such a system, hand–eye calibration is an essential component that aims to determine the transformation between the position tracking system and the robot-arm system. In this paper, we propose an effective hand–eye calibration method, namely registration-based hand–eye calibration (RHC), which estimates the calibration transformation via point set registration without the need to solve the AX = XB equation. Our hand–eye calibration method consists of tool-tip pivot calibrations in two coordinate systems, in addition to paired-point matching, where the point pairs are generated via the steady movement of the robot arm in space. After calibration, our system allows for robot-assisted, image-guided pedicle screw insertion. Comprehensive experiments are conducted to verify the efficacy of the proposed hand–eye calibration method. A mean distance deviation of 0.70 mm and a mean angular deviation of 0.68° are achieved by our system when the proposed hand–eye calibration method is used. Further experiments on drilling trajectories are conducted on plastic vertebrae as well as pig vertebrae. A mean distance deviation of 1.01 mm and a mean angular deviation of 1.11° are observed when the drilled trajectories are compared with the planned trajectories on the pig vertebrae.
Keywords: hand–eye calibration; robot-assisted surgery; pedicle screw insertion; paired-point matching

1. Introduction
Pedicle screw insertion is an effective treatment of spinal diseases, such as scoliosis, in addition to spinal fracture and vertebral injury. Manual implantation is challenging, especially in patients with severe spinal deformity, osteoporosis, or tumor [1–3]. To address the challenge, one of the proposed technologies is to integrate a robot arm with a computer navigation system [4–7]. In developing such a system, hand–eye calibration is an essential component, which aims to determine the homogeneous transformation between the robot hand/end-effector and the optical frame affixed to the end-effector [8,9].

Due to its importance, a number of approaches have been developed to solve the problem. Hand–eye calibration can be formulated in the form of AX = XB, where A and B are the robotic end-effector and the optical frame poses between successive time frames, respectively, and X is the unknown transformation matrix between the robot end-effector and the optical frame. Many solutions have been proposed to recover X given data streams {Ai} and {Bi}. Solutions to the problem can be roughly classified into four categories, i.e., separable solutions [10–14], simultaneous solutions [15–17], iterative solutions [8,18–21], and probabilistic methods [22,23]. Specifically, given equations A and B, it is possible to decompose the equation into rotational and translational parts. Separable solutions utilize this property to solve hand–eye calibration, where the rotation part is first solved, followed by solving the translational part. In contrast, simultaneous solutions solve the rotational and translational parts at the same time. Methods in the third category solve a nonlinear optimization problem by minimizing equations such as ||AX − XB||. As the algorithm iterates, it will converge on a solution to X. Different from methods of the first
three categories, which assume an exact correspondence between the data streams {Ai }
and {Bi }, methods in the fourth category eliminate such a requirement.
Despite these efforts, accurate hand–eye calibration is challenging for the following
reasons. First, although separable methods are useful, any error in the estimation for
the rotational part is compounded when being applied to solving the translational part.
Second, while simultaneous solutions can significantly reduce the propagation of error [24],
they are sensitive to the nonlinearities present in measurements in the form of noise
and errors [25]. Third, although it was observed that the nonlinear iterative approaches yielded better results than linear and closed-form solutions in terms of accuracy [25], they
can be computationally expensive to carry out and may not always converge on the
optimal solution.
In this paper, to tackle these challenges, we propose an effective hand–eye calibration
method, namely registration-based hand–eye calibration (RHC), which estimates the cali-
bration transformation via paired-point matching without the need to solve the AX = XB
equation. Specifically, in our solution, we reformulate hand–eye calibration as tool-tip pivot
calibrations in two coordinate systems and a paired-point matching, taking advantage
of the steady movement of the robot arm and thus reducing measurement errors and
noise. The hand–eye calibration problem is then solved via closed-form solutions to three
overdetermined equation systems. Our point set registration-based hand–eye calibration
method has the following advantages:
• Our method is a simultaneous closed-form solution, which guarantees an
optimal solution;
• Unlike other simultaneous solutions, our solution is obtained by solving three nonlin-
ear least-square fitting problems, leading to three overdetermined equation systems.
Thus, it is not sensitive to the nonlinearities present in measurements in the form of
noise and errors;
• In comparison with the nonlinear iterative approaches, our method requires only
simple matrix operations. Thus, it is computationally efficient;
• Our method achieves better results than the state-of-the-art (SOTA) methods.
The paper is organized as follows. Section 2 reviews related works. Section 3 presents
the proposed method. Section 4 describes the experiments and results. Finally, we present
discussions in Section 5, followed by our conclusion in Section 6.

2. Related Works
Considerable effort has been devoted to solving the problem of hand–eye calibration.
Due to the wide applications of robot-assisted procedures, different types of methods
have been developed for increased accuracy and robustness. Existing solutions can be
roughly classified into four categories, as shown in Table 1, i.e., separable closed-form
solutions [10–14], simultaneous closed-form solutions [15–17], iterative solutions [18–20],
and probabilistic methods [22,23].
The earliest approaches separately estimated the rotational and translational parts.
For example, Shiu et al. proposed a method for solving homogeneous transform equa-
tions [10]. Tsai presented an efficient 3D robotics hand–eye calibration algorithm that
computed 3D position and orientation separately [11]. Quaternion-based [13], extrinsic
hand–eye calibration [12], and dual-quaternions-based calibration methods [14] have been
introduced for the individual estimations of rotational and translational parts. One known
problem with separable methods is that any error in the estimation of the rotation matrices
may be propagated to the estimation of the translation vector.
Table 1. Comparison of existing solutions to the hand–eye calibration problem.

| Categories | Solutions | Drawbacks |
|---|---|---|
| Separable solutions [10–14] | Solve the rotation part first; then, solve the translational part. | Error propagation problem. |
| Simultaneous solutions [15–17] | Solve the rotational and translational parts at the same time. | Sensitive to the nonlinearities present in measurements in the form of noise and errors. |
| Iterative solutions [8,18–21] | Solve a nonlinear optimization problem by minimizing the error by iteration. | Computationally expensive; may not always converge on the optimal solution. |
| Probabilistic methods [22,23] | Solve the calibration problem without the assumption of exact correspondence between the data streams. | Computationally expensive. |

To avoid the error propagation problem with separable solutions, methods in the
second category simultaneously compute the orientation and position. For example, Lu et al.
proposed an approach that transformed the kinematic equation into linear systems using
normalized quaternions [16]. Andreff et al. proposed an on-line hand–eye calibration
method that derived a linear formulation of the problem [15]. Zhao et al. [17] proposed a
hand–eye calibration method based on screw motion theory to establish linear equations
and simultaneously solve rotation and translation. As confirmed by experimental results,
simultaneous methods have less error than separable solutions [25].
Iterative solutions are another type of method used to solve the problem of error
propagation. For example, Zhuang et al. [18] presented an iterative algorithm to solve the
unknown matrix X in one stage, thus eliminating error propagation and improving noise
sensitivity. Mao et al. [20] proposed using a direct linear closed-form solution followed
by Jacobian optimization to solve AX = XB for hand–eye calibration. Hirsh et al. [26]
proposed a robust iterative method to simultaneously estimate both the hand–eye and
robot–world spatial transformation. Based on a metric defined on the group of the rigid
transformation SE(3), Strobl and Hirzinger [27] presented an error model for nonlinear
optimization. They then proposed a calibration method for estimating both the hand–eye
and robot–world transformations. While iterative solutions are generally accurate, they can
be computationally expensive and may not always converge to the optimal solution [28].
The methods mentioned above assume an exact correspondence between the streams
of sensor data, while methods in the fourth category eliminate such a requirement. For ex-
ample, Ma et al. [23] proposed two probabilistic approaches by giving new definitions of
the mean on SE(3), which alleviated the restrictions on the dataset and led to improved
accuracy. Although it is worth investigating the situation when the exact correspondence
between sensor data is unknown, probabilistic methods usually lead to longer computation
times. Additionally, assuming an exact correspondence is not a problem in our study.
Hand–eye calibration is also an active research topic in medical applications. For ex-
ample, Morgan et al. [29] presented a Procrustean perspective-n-point (PnP) solution for
hand–eye calibration for surgical cameras, achieving an average projection error of 12.99
pixels when evaluated on a surgical laparoscope. Özgüner et al. [30] proposed a solution
for hand–eye calibration for the da Vinci robotic surgical system by breaking down the
calibration procedure into systematic steps to reduce error accumulation. They reported
a root mean square (RMS) error of 2.1 mm and a mean rotational error of 3.2° when their
calibration method was used to produce visually-guided end-effector motions. Using
the da Vinci Research Kit (dVRK) and an RGB-D camera, Roberti et al. [31] proposed to
separate the calibration of the robotic arms and an endoscope camera manipulator from the
hand–eye calibration of the camera for an improved accuracy in a 3D metric space. The pro-
posed method reached a sub-millimeter accuracy in a dual-arm manipulation scenario,
while the use of the RGB-D camera limited its actual application in surgery. Sun et al. [32]
proposed a hand–eye calibration method for robot-assisted minimally invasive surgery,
which relied purely on surgical instruments already in the operating scenario. Their model
was formed by the geometry information of the surgical instrument and the remote center-
of-motion (RCM) constraint, outperforming traditional hand–eye calibration methods in
both simulation and robot experiments.
Deep learning-based methods, especially those based on convolutional neural net-
works (CNN), have also been developed for low-level image-processing tasks in hand–eye
calibration [33–36]. For example, Valassakis et al. [34] proposed a sparse correspondence
model that used a U-Net to detect 2D key points for eye-in-hand camera calibration.
Kim et al. [36] introduced deep learning-based methods to restore out-of-focus blurred
images for an improved accuracy in hand–eye calibration.

3. Materials and Methods


3.1. System Overview
Our robot-assisted, image-guided pedicle screw insertion system consists of a master
computer, an optical tracking camera (Polaris Vega XT, NDI, Waterloo, ON, Canada) and a
robot arm (UR 5e, Universal Robots, Odense, Denmark) with a guiding tube. The master
computer communicates with the tracking camera to obtain poses of different optical tracking frames, and with the remote controller of the UR robot in order to realize a steady movement and to receive feedback information.
During pedicle screw insertion, the target point and the aiming trajectory are planned
in a pre-operative CT, which are transformed to the tracking camera space via a homoge-
neous transformation obtained by a surface registration [37]. Then, the pose of the guide
will be adjusted to align with the planned trajectory. Thus, it is essential to determine
the spatial transformation from the tracking camera space to the robot space, as shown
in Figure 1. The transformation can be obtained via two different calibration procedures,
including the hand–eye calibration and guiding tube calibration.

Figure 1. The involved coordinate systems for robot-assisted, image-guided pedicle screw insertion.
During a pedicle screw insertion procedure, the pose of the guide is adjusted to align with a trajectory,
which is planned in a pre-operative CT first, and then is transformed to the patient space via a
surface registration.
Our robot-assisted, image-guided pedicle screw insertion procedure involves the following coordinate systems (COS), as shown in Figure 1. The 3D COS of the optical tracking camera is represented by $O_C$; the 3D COS of the optical reference frame on the end-effector is represented by $O_M$; the 3D COS of the robotic flange is represented by $O_F$; the 3D COS of the guiding tube is represented by $O_T$; the 3D COS of the robot base is represented by $O_B$; the 3D COS of the pre-operative CT data is represented by $O_{CT}$; and the 3D COS of the optical reference frame attached to the patient/phantom is represented by $O_R$. At any time, poses of different optical tracking frames with respect to the tracking camera, such as ${}^{C}_{M}T$ and ${}^{C}_{R}T$, are known. At the same time, the pose of the robotic flange with respect to the robot base ${}^{B}_{F}T$ is known. This transformation information can be retrieved from the API (application programming interface) of the associated devices.

3.2. Registration-Based Hand–Eye Calibration


The aim of the hand–eye calibration is to establish the spatial transformation between the robot system and the optical tracking system. Mathematically, we solve the 4 × 4 spatial transformation matrix from the COS $O_M$ to the COS $O_F$, referred to as ${}^{F}_{M}T$. In this subsection, the proposed registration-based hand–eye calibration (RHC) is introduced, which mainly consists of two steps: (1) solving tool-tip pivot calibrations in both the optical tracking camera COS $O_C$ and the robot base COS $O_B$; (2) solving hand–eye calibration via a paired-point matching.

3.2.1. Tool-Tip Calibration


In the first step, we rigidly fixed a calibration tool with a sharp tip to the flange, as shown in Figure 2a. We then need to determine the coordinates of the tool tip relative to the two respective coordinate systems, i.e., $O_M$ and $O_F$. We obtained both by pivot calibration [38]. Once calibrated, the coordinates of the tool tip with respect to $O_M$ and $O_F$ are known, which will then be used in the next step to compute a paired-point matching.
We will start by describing the pivot calibration of the coordinate of the tool tip with respect to the coordinate system $O_M$. We pivoted the tool tip around a stationary point, as shown in Figure 2b, to estimate the coordinates of the tool tip in both the optical tracking camera COS $O_C$ and the 3D COS $O_M$ of the optical reference frame on the end-effector. During pivoting, we placed the tool tip in a divot, which has the same size and shape as the tool tip, to avoid any possible sliding. Then, we moved the tool around this pivot point while always touching the divot with its tip. We denoted, respectively, the two offsets as $p_C$ and $p_M$. During pivoting, we kept $p_C$ and $p_M$ static while collecting a set of $n$ homogeneous transformations $\{({}^{C}_{M}T)_i = (({}^{C}_{M}R)_i, ({}^{C}_{M}t)_i);\ 1 \le i \le n\}$ via the tracking camera API. Then, we estimated $p_C$ and $p_M$ by solving the following overdetermined equations:

$$
\begin{bmatrix}
({}^{C}_{M}R)_1 & -I \\
\vdots & \vdots \\
({}^{C}_{M}R)_n & -I
\end{bmatrix}
\begin{bmatrix} p_M \\ p_C \end{bmatrix}
=
\begin{bmatrix}
-({}^{C}_{M}t)_1 \\
\vdots \\
-({}^{C}_{M}t)_n
\end{bmatrix}
\tag{1}
$$

where $I$ is the $3 \times 3$ identity matrix.

Defining $\tilde{R} = \begin{bmatrix} ({}^{C}_{M}R)_1 & -I \\ \vdots & \vdots \\ ({}^{C}_{M}R)_n & -I \end{bmatrix}$ and $\tilde{t} = \begin{bmatrix} -({}^{C}_{M}t)_1 \\ \vdots \\ -({}^{C}_{M}t)_n \end{bmatrix}$, we have:

$$
\tilde{R} \begin{bmatrix} p_M \\ p_C \end{bmatrix} = \tilde{t}
\tag{2}
$$

Then, we can solve $p_M$ and $p_C$ using the pseudo-inverse [39]:

$$
\begin{bmatrix} p_M \\ p_C \end{bmatrix} = (\tilde{R}^{T}\tilde{R})^{-1}\tilde{R}^{T}\tilde{t}
\tag{3}
$$
As we are only interested in knowing the offset of the tool tip with respect to $O_M$, we keep $p_M$ while disregarding $p_C$.
Similarly, we can use the same pivot calibration technique to estimate the coordinates of the tool tip in both the robotic flange COS $O_F$ and the robot base COS $O_B$. This time, we pivoted the tool tip around a stationary point, as shown in Figure 2c. Similarly, we placed the tool tip in a divot to avoid sliding. We denoted, respectively, the two coordinates as $p_B$ and $p_F$. During pivoting, we kept $p_B$ and $p_F$ static while collecting a set of $l$ homogeneous transformations $\{({}^{B}_{F}T)_i = (({}^{B}_{F}R)_i, ({}^{B}_{F}t)_i);\ 1 \le i \le l\}$ via the robot arm API. Then, we estimated $p_B$ and $p_F$ by solving the following overdetermined equations [39]:

$$
\begin{bmatrix}
({}^{B}_{F}R)_1 & -I \\
\vdots & \vdots \\
({}^{B}_{F}R)_l & -I
\end{bmatrix}
\begin{bmatrix} p_F \\ p_B \end{bmatrix}
=
\begin{bmatrix}
-({}^{B}_{F}t)_1 \\
\vdots \\
-({}^{B}_{F}t)_l
\end{bmatrix}
\tag{4}
$$

where $I$ is the $3 \times 3$ identity matrix.

Again, we are only interested in knowing the offset of the tool tip with respect to $O_F$; therefore, we kept $p_F$ while disregarding $p_B$.
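As an illustration, the following sketch solves the overdetermined systems of Equations (1)–(4) by linear least squares. It assumes NumPy and uses hypothetical variable and function names; it is a minimal sketch rather than the authors' implementation.

```python
import numpy as np

def pivot_calibration(transforms):
    """Estimate the tool-tip offset from a set of 4x4 homogeneous transforms.

    transforms: list of 4x4 arrays, e.g. {(C_M T)_i} from the camera API or
                {(B_F T)_i} from the robot API, collected while the tip rests
                in the divot.
    Returns (p_local, p_fixed): the tip offset in the moving frame (p_M or p_F)
    and the tip position in the fixed frame (p_C or p_B).
    """
    A_rows, b_rows = [], []
    for T in transforms:
        R, t = T[:3, :3], T[:3, 3]
        # One block row of the system [R  -I][p_local; p_fixed] = -t
        A_rows.append(np.hstack([R, -np.eye(3)]))
        b_rows.append(-t)
    A = np.vstack(A_rows)        # shape (3n, 6)
    b = np.concatenate(b_rows)   # shape (3n,)
    # Least-squares solution, equivalent to the pseudo-inverse in Equation (3)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]
```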

Figure 2. Tool-tip pivot calibration. (a) The calibration tool with a sharp tip is rigidly fixed to the flange during the hand–eye calibration; (b) pivot calibration of the offset of the tool tip with respect to the 3D COS $O_M$ of the optical reference frame on the end-effector; (c) pivot calibration of the offset of the tool tip with respect to the robotic flange COS $O_F$.

3.2.2. Solving Hand–Eye Calibration via Paired-Point Matching


After obtaining the offsets of the tool tip with respect to the two coordinate systems $O_M$ and $O_F$, we can compute the coordinates of the tool tip in both the tracking camera COS $O_C$ and the robot base COS $O_B$ at any time via the corresponding device's API. In this section, we present an elegant method to solve the hand–eye calibration via paired-point matching using the setup shown in Figure 3.

Specifically, during the hand–eye calibration, we maintained a stationary spatial relationship between the robot base and the tracking camera while moving the robot flange. By controlling the flange to move to $m$ different positions, we collected two sets of points: $P_C = \{(p_C)_1, \ldots, (p_C)_m\}$, the coordinates of the tool tip measured in the tracking camera COS $O_C$ via $(p_C)_i = ({}^{C}_{M}T)_i\, p_M$, and $P_B = \{(p_B)_1, \ldots, (p_B)_m\}$, the coordinates of the tool tip measured in the robot base COS $O_B$ via $(p_B)_i = ({}^{B}_{F}T)_i\, p_F$. Therefore, we can solve the spatial transformation ${}^{C}_{B}T$ using a paired-point matching algorithm.
Figure 3. Solving ${}^{C}_{B}T$ via paired-point matching. By controlling the flange to move to $m$ different positions, we can obtain the coordinates of the tool tip in both the optical tracking camera COS $O_C$ and the robot base COS $O_B$, generating two point sets. ${}^{C}_{B}T$ is solved by matching the two point sets using a paired-point matching algorithm.

For the first step to match the two paired-point sets, we computed a $3 \times 3$ matrix $H$ as follows:

$$
H = \sum_{j=1}^{m} \Big( (p_B)_j - \frac{1}{m}\sum_{i=1}^{m}(p_B)_i \Big) \cdot \Big( (p_C)_j - \frac{1}{m}\sum_{i=1}^{m}(p_C)_i \Big)^{T}
\tag{5}
$$

We then used the singular value decomposition (SVD) [39] to decompose the matrix $H$ into $U$, $S$, and $V$ matrices:

$$
H = USV^{T}
\tag{6}
$$

Based on the decomposed matrices, we computed the rotation matrix ${}^{C}_{B}R$ as:

$$
{}^{C}_{B}R = V \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & \lambda \end{bmatrix} U^{T}
\tag{7}
$$

where $\lambda = \det(UV)$.

Based on ${}^{C}_{B}R$, we solved ${}^{C}_{B}t$ using:

$$
{}^{C}_{B}t = \frac{1}{m}\sum_{i=1}^{m}(p_C)_i - {}^{C}_{B}R\, \frac{1}{m}\sum_{i=1}^{m}(p_B)_i
\tag{8}
$$

Therefore, we obtained the spatial transformation ${}^{C}_{B}T$ as:

$$
{}^{C}_{B}T = ({}^{C}_{B}R,\ {}^{C}_{B}t)
\tag{9}
$$
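For illustration, a minimal NumPy sketch of the paired-point matching in Equations (5)–(9) could look as follows; the function and variable names are hypothetical.

```python
import numpy as np

def paired_point_matching(P_B, P_C):
    """Solve C_B T from paired tool-tip positions (Equations (5)-(9)).

    P_B, P_C: (m x 3) arrays of corresponding tip positions in the robot base
              COS O_B and the tracking camera COS O_C, respectively.
    Returns the 4x4 homogeneous transform mapping points from O_B to O_C.
    """
    cB, cC = P_B.mean(axis=0), P_C.mean(axis=0)       # centroids
    H = (P_B - cB).T @ (P_C - cC)                     # 3x3 covariance, Eq. (5)
    U, S, Vt = np.linalg.svd(H)                       # Eq. (6)
    lam = np.linalg.det(U @ Vt)                       # reflection guard (+/-1)
    R = Vt.T @ np.diag([1.0, 1.0, lam]) @ U.T         # Eq. (7)
    t = cC - R @ cB                                   # Eq. (8)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t                        # Eq. (9)
    return T
```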

For each position in the movement trajectory, we computed the spatial transformation $({}^{F}_{M}T)_i$ as:

$$
({}^{F}_{M}T)_i = ({}^{B}_{F}T)_i^{-1} \cdot ({}^{C}_{B}T)^{-1} \cdot ({}^{C}_{M}T)_i
\tag{10}
$$
where $({}^{B}_{F}T)_i$ and $({}^{C}_{M}T)_i$ are retrieved from the associated device's API when generating $P_C$ and $P_B$.

Each position will give a different $({}^{F}_{M}T)_i$. To improve the robustness and to increase the accuracy, we averaged all the obtained transformations. Specifically, we used $(\psi_i, \theta_i, \phi_i)$ to represent the Euler angles of $({}^{F}_{M}R)_i$, so the average rotation matrix ${}^{F}_{M}R$ can be written as:

$$
{}^{F}_{M}R = R\Big( \frac{1}{m}\sum_{i=1}^{m}\psi_i,\ \frac{1}{m}\sum_{i=1}^{m}\theta_i,\ \frac{1}{m}\sum_{i=1}^{m}\phi_i \Big)
\tag{11}
$$

where $R(\cdot)$ represents the transformation from the Euler angles to the rotation matrix. Meanwhile, the average translation vector ${}^{F}_{M}t$ can be written as:

$$
{}^{F}_{M}t = \frac{1}{m}\sum_{i=1}^{m}({}^{F}_{M}t)_i
\tag{12}
$$

where $({}^{F}_{M}t)_i$ is the translation vector of $({}^{F}_{M}T)_i$.

Therefore, the hand–eye transformation ${}^{F}_{M}T$ is composed of the average rotation matrix ${}^{F}_{M}R$ and the average translation vector ${}^{F}_{M}t$, written as:

$$
{}^{F}_{M}T = ({}^{F}_{M}R,\ {}^{F}_{M}t)
\tag{13}
$$
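A sketch of Equations (10)–(13) is given below. It assumes NumPy and SciPy's Rotation class for the Euler-angle conversion; the "zyx" Euler convention used here is an assumption, since the text does not specify one, and the function names are hypothetical. It takes the result of the paired-point matching (e.g., from the sketch above) as input. Averaging the Euler angles componentwise, as in Equation (11), is reasonable here because every $({}^{F}_{M}T)_i$ estimates the same rigid transformation.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def hand_eye_from_poses(T_BF_list, T_CM_list, T_CB):
    """Average the per-position estimates (F_M T)_i into F_M T (Eqs. (10)-(13)).

    T_BF_list: list of 4x4 flange poses (B_F T)_i from the robot API.
    T_CM_list: list of 4x4 marker poses (C_M T)_i from the camera API.
    T_CB: 4x4 transform C_B T obtained via paired-point matching.
    """
    T_BC = np.linalg.inv(T_CB)                       # (C_B T)^-1
    eulers, trans = [], []
    for T_BF, T_CM in zip(T_BF_list, T_CM_list):
        T_FM = np.linalg.inv(T_BF) @ T_BC @ T_CM     # Eq. (10)
        eulers.append(Rotation.from_matrix(T_FM[:3, :3]).as_euler("zyx"))
        trans.append(T_FM[:3, 3])
    R_mean = Rotation.from_euler("zyx", np.mean(eulers, axis=0)).as_matrix()  # Eq. (11)
    t_mean = np.mean(trans, axis=0)                  # Eq. (12)
    T_FM_mean = np.eye(4)
    T_FM_mean[:3, :3], T_FM_mean[:3, 3] = R_mean, t_mean  # Eq. (13)
    return T_FM_mean
```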

3.3. Guiding Tube Calibration


To achieve the robot-assisted pedicle screw insertion, the guiding tube that guides the drilling of a screw insertion trajectory needs to be calibrated. The guiding tube calibration is a procedure to estimate the transformation ${}^{M}_{T}T$ of the COS $O_T$ defined on the guiding tube relative to the COS $O_M$ of the optical reference frame attached to the robot end-effector. In this calibration procedure, we utilized two COSs, i.e., the local COS $O_T$ of the guiding tube and the COS $O_M$, as shown in Figure 4.
The local COS $O_T$ can be determined using three points: the two end points of the guiding tube that lie on the center axis of the tube (referred to as $p^{(1)}$ and $p^{(2)}$), and one further point that is on the guiding tube (referred to as $p^{(3)}$). To digitize $p^{(1)}$ and $p^{(2)}$, we used a plug inserted into the guiding tube. We then digitized the coordinates of these three points in the COS $O_M$, referred to as $p^{(1)}_M$, $p^{(2)}_M$, and $p^{(3)}_M$, respectively.

Figure 4. A schematic view of the guiding tube calibration. (a) The plug, which can be inserted into the guiding tube from both ends for digitization. (b) The three points on the tube that are digitized and the COS $O_T$ of the guiding tube established using the three points.
To establish the COS $O_T$, we defined the origin by $p^{(2)}$, the z-axis by $p^{(1)}$ and $p^{(2)}$, and constrained the x–z plane by the three points. We obtained the transformation ${}^{M}_{T}T$ from its origin and axes, as:

$$
{}^{M}_{T}T =
\begin{bmatrix}
\frac{r^{(x)}_M}{\| r^{(x)}_M \|} & \frac{r^{(y)}_M}{\| r^{(y)}_M \|} & \frac{r^{(z)}_M}{\| r^{(z)}_M \|} & p^{(2)}_M \\
0 & 0 & 0 & 1
\end{bmatrix}
\tag{14}
$$

where

$$
\begin{cases}
r^{(x)}_M = \big( (p^{(3)}_M - p^{(2)}_M) \times (p^{(1)}_M - p^{(2)}_M) \big) \times (p^{(1)}_M - p^{(2)}_M) \\
r^{(y)}_M = (p^{(3)}_M - p^{(2)}_M) \times (p^{(1)}_M - p^{(2)}_M) \\
r^{(z)}_M = p^{(1)}_M - p^{(2)}_M
\end{cases}
\tag{15}
$$
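For illustration, a minimal sketch of Equations (14)–(15), assuming NumPy and hypothetical names for the three digitized points:

```python
import numpy as np

def guiding_tube_frame(p1_M, p2_M, p3_M):
    """Build M_T T from three digitized points (Equations (14)-(15)).

    p1_M, p2_M: tube end points on the center axis, expressed in O_M.
    p3_M: an additional point on the tube, expressed in O_M.
    """
    r_z = p1_M - p2_M                              # z-axis along the tube axis
    r_y = np.cross(p3_M - p2_M, p1_M - p2_M)       # normal to the x-z plane
    r_x = np.cross(r_y, p1_M - p2_M)               # completes a right-handed frame
    T = np.eye(4)
    T[:3, 0] = r_x / np.linalg.norm(r_x)
    T[:3, 1] = r_y / np.linalg.norm(r_y)
    T[:3, 2] = r_z / np.linalg.norm(r_z)
    T[:3, 3] = p2_M                                # origin at p^(2)
    return T
```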

3.4. Robot-Assisted Pedicle Screw Insertion


Figure 5 illustrates the schematic view of the robot-assisted pedicle screw insertion
procedure. The workflow of the robot-assisted, image-guided pedicle screw insertion
consists of the following three steps: (1) pre-operative trajectory planning; (2) intra-operative registration; (3) transforming the planned trajectory to the robot base COS $O_B$ and aligning
the guiding tube with the transformed trajectory.

3.4.1. Pre-Operative Planning


In the first step, we obtained a pre-operative CT scan before the operation. We segmented the target vertebra in the CT image and defined a trajectory using an entry point $p^{(e)}_{CT}$ and a target point $p^{(t)}_{CT}$ in the image space.

Figure 5. A schematic view of robot-assisted pedicle screw insertion: The target trajectory is planned in the COS $O_{CT}$ and transformed to the COS $O_R$ by intra-operative registration. The target trajectory is further transformed to the robot base COS $O_B$. The guiding tube is aligned with the target trajectory for insertion guidance.

3.4.2. Intra-Operative Registration


In the second step, we performed an intra-operative registration to establish the spatial transformation ${}^{R}_{CT}T$ from the CT image COS $O_{CT}$ to the COS $O_R$. By digitizing points on the surface, we adopted a surface registration algorithm [37] to solve ${}^{R}_{CT}T$. Based on ${}^{R}_{CT}T$, $p^{(e)}_{CT}$ and $p^{(t)}_{CT}$ can be transformed to the COS $O_R$.

3.4.3. Transforming the Planned Trajectory to the Robot Base COS and Aligning the
Guiding Tube with the Transformed Trajectory
In the third step, we transformed the planned trajectory to the robot base COS $O_B$ so that the robot can align the guiding tube with the transformed trajectory, which is calculated as:

$$
\begin{bmatrix} p^{(e)}_B & p^{(t)}_B \end{bmatrix} = {}^{B}_{F}T \cdot {}^{F}_{M}T \cdot ({}^{C}_{M}T)^{-1} \cdot {}^{C}_{R}T \cdot {}^{R}_{CT}T \begin{bmatrix} p^{(e)}_{CT} & p^{(t)}_{CT} \end{bmatrix}
\tag{16}
$$

In Equation (16), we retrieved ${}^{C}_{R}T$ and ${}^{C}_{M}T$ from the optical tracking camera's API. ${}^{F}_{M}T$ is the hand–eye transformation. We retrieved ${}^{B}_{F}T$ from the robot arm's API.
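A minimal sketch of the transformation chain in Equation (16), assuming NumPy and hypothetical names for the 4 × 4 transforms retrieved from the registration, camera, and robot APIs:

```python
import numpy as np

def plan_to_robot_base(p_CT, T_R_CT, T_C_R, T_C_M, T_F_M, T_B_F):
    """Transform a planned point from the CT COS to the robot base COS (Eq. (16)).

    p_CT: planned entry or target point in O_CT, as a 3-vector.
    T_R_CT, T_C_R, T_C_M, T_F_M, T_B_F: 4x4 transforms R_CT T, C_R T, C_M T,
    F_M T (the hand-eye transformation), and B_F T.
    """
    p_h = np.append(p_CT, 1.0)                          # homogeneous coordinates
    chain = T_B_F @ T_F_M @ np.linalg.inv(T_C_M) @ T_C_R @ T_R_CT
    return (chain @ p_h)[:3]
```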

4. Experiments and Results


In this section, we introduce the experiments and results of our study. We designed and conducted three experiments to investigate the efficacy of the proposed method: (1) an investigation of the influence of the range of robot movement on hand–eye calibration; (2) a comparison with state-of-the-art hand–eye calibration methods; and (3) an overall system accuracy study.

4.1. Metrics
In the experiments, the performance is quantified by the deviations between the actual path and the planned trajectory. The deviations consist of the incline angle (unit: °) and the distance deviation (unit: mm). We used the entry point $p^{(e)}$ and the target point $p^{(t)}$ on the planned trajectory to measure the distance, as shown in Figure 6. The distance deviation and incline angle between the guidance path and the planned trajectory are denoted as $d$ and $\phi$, respectively, while the distance deviation and the incline angle between the drilled path and the planned trajectory are denoted as $d'$ and $\phi'$, respectively.

Figure 6. Metrics used to evaluate the accuracy in this study, including distance deviations $d$ and $d'$, as well as incline angles $\phi$ and $\phi'$.
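As these metrics are only described informally here, the following sketch shows one plausible way to compute them; the exact definitions used by the authors may differ. The assumptions are that the incline angle is the angle between the two direction vectors and that the distance deviation is the point-to-line distance from the planned target point to the measured path.

```python
import numpy as np

def trajectory_deviation(p_e, p_t, q_e, q_t):
    """Deviation between a planned trajectory (p_e, p_t) and a measured path (q_e, q_t).

    Returns (distance deviation in mm, incline angle in degrees), under the
    assumptions stated in the text above.
    """
    u = (p_t - p_e) / np.linalg.norm(p_t - p_e)        # planned direction
    v = (q_t - q_e) / np.linalg.norm(q_t - q_e)        # measured direction
    angle = np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))
    # Point-to-line distance from the planned target point to the measured path
    w = p_t - q_e
    dist = np.linalg.norm(w - np.dot(w, v) * v)
    return dist, angle
```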

4.2. Investigation of the Influence of the Range of Robot Movement on the Hand–Eye Calibration
In this experiment, we investigated the influence of the spatial range of robot movement on the proposed RHC. In the experiment, a plastic phantom was designed and used,
as shown in Figure 7a. The phantom, which was fabricated by 3D printing, had dimensions of 140 × 90 × 85 mm³, and 25 trajectories were planned on the phantom.

During the hand–eye calibration, the robot is controlled to move in an L × L × L mm³ cubic space. To investigate the influence of the range of robot movement, we calibrated different hand–eye transformation matrices with an L of 30, 60, 90, 120, 150, or 200 mm. Each time, after obtaining the hand–eye calibration, we used the obtained transformation to control the robot to align the guiding tube with a planned trajectory. After that, we digitized the guidance path to evaluate the alignment accuracy.

Experimental results are shown in Figure 7 and Table 2. Both d and φ decreased when L increased. When L was 200 mm, the mean distance deviation was 0.70 mm and the mean incline angle was 0.68°. The results demonstrate that the larger the robot movement range, the higher the hand–eye calibration accuracy. However, further increasing the movement range will lead to a failure in tracking by the camera. We found that the maximally allowed robot movement range is 200 × 200 × 200 mm³.

Figure 7. Investigation of the influence of the range of robot movement on the hand–eye calibration: (a) the plastic phantom used in the experiment; (b) the box plots of distance deviation and incline angle.

Table 2. Investigation of the influence of the range of robot movement on the hand–eye calibration.

| L [mm] | d Mean [mm] | d Max. [mm] | φ Mean [°] | φ Max. [°] |
|---|---|---|---|---|
| 30 | 1.17 | 1.40 | 0.87 | 1.25 |
| 60 | 0.86 | 1.09 | 0.83 | 0.93 |
| 90 | 0.82 | 0.95 | 0.72 | 0.91 |
| 120 | 0.86 | 1.06 | 0.75 | 0.90 |
| 150 | 0.71 | 1.11 | 0.70 | 0.85 |
| 200 | 0.70 | 0.88 | 0.68 | 0.96 |

4.3. Comparison with State-of-the-Art Hand–Eye Calibration Methods


The plastic phantom introduced in Section 4.2 was also used in this study to compare
our method with state-of-the-art (SOTA) hand–eye calibration methods, including Tsai’s
method [11], Andreff’s method [15], Chou’s method [13], Shah’s method [40], and Wu’s
method [8]. Each time, after obtaining the hand–eye calibration using one of the mentioned methods, we used the obtained transformation to control the robot to align the guiding tube with a planned trajectory. After that, we digitized the guidance path to evaluate the alignment accuracy, which reflects the hand–eye calibration accuracy.
The distance deviation d and the angular deviation φ are shown in Figure 8 and Table 3.
We also report the computational time cost for each method in Table 3. In comparison with
the SOTA methods, our method achieved the best results in terms of distance deviation
and incline angle. Meanwhile, the time cost of our method is much lower than the iterative
calibration method [8], as shown in Table 3.
Figure 8. Comparison of our method with the SOTA hand–eye calibration methods.

Table 3. Comparison of our method with the SOTA hand–eye calibration methods.

| Method | d Mean [mm] | d Max. [mm] | φ Mean [°] | φ Max. [°] | Computation Time [ms] |
|---|---|---|---|---|---|
| Tsai [11] | 0.74 | 0.92 | 0.75 | 0.88 | 1.18 |
| Andreff [15] | 0.73 | 0.87 | 0.70 | 0.92 | 2.23 |
| Chou [13] | 0.73 | 0.84 | 0.69 | 0.89 | 0.82 |
| Shah [40] | 0.74 | 0.92 | 0.72 | 0.97 | 0.63 |
| Wu [8] | 0.72 | 0.88 | 0.68 | 0.90 | 26.84 |
| Ours | 0.70 | 0.88 | 0.68 | 0.96 | 2.21 |

4.4. Overall System Accuracy Study


To evaluate the overall system accuracy, we conducted trajectory drilling experiments
on three types of objects: (a) the plastic phantom used in Section 4.2, (b) four 3D-printed
vertebrae, and (c) eight pig vertebrae. Each time, we controlled the robot to align the
guiding tube with the planned trajectory and drilled a trajectory into the test subject.
In total, we planned and drilled 20 trajectories on the plastic phantom, another 8 trajectories on the 3D-printed vertebrae, and a further 8 trajectories on the pig vertebrae. For each
trajectory, after drilling, both the guidance paths and the drilled paths were digitized to
measure the accuracy.
Results are shown in Figure 9 and Table 4. Specifically, on the plastic phantom, the average distance deviation and the average incline angle between the guiding paths and the planned trajectories are 0.70 mm and 0.72°, respectively, while the average distance deviation and the average incline angle between the drilled trajectories and the planned trajectories are 0.93 mm and 1.04°, respectively. Additionally, on the 3D-printed vertebrae, our system achieved a slightly better result, i.e., the average distance deviation and the average incline angle between the guiding paths and the planned trajectories are 0.66 mm and 0.79°, respectively, and the average distance deviation and the average incline angle between the drilled trajectories and the planned trajectories are 0.90 mm and 0.96°, respectively. Finally, we evaluated our system accuracy on the pig vertebrae. The average distance deviation and the average incline angle between the guiding paths and the planned trajectories are 0.71 mm and 0.82°, respectively, while the average distance deviation and the average incline angle between the drilled trajectories and the planned trajectories are 1.01 mm and 1.11°, respectively. Figure 9b shows a post-operative CT scan of the drilled path on a pig vertebra, demonstrating the high accuracy of our system. Both quantitative and qualitative results demonstrate that our system accuracy is good enough for robot-assisted pedicle screw insertion.
Figure 9. Overall system accuracy study: (a) the 3D-printed vertebra and pig vertebrae; (b) the CT
image of the animal vertebrae after drilling; (c) the box plots of distance deviation and incline angle.

Table 4. Overall system accuracy study.

| Metric | | Plastic Phantom | 3D-Printed Vertebrae | Pig Vertebrae |
|---|---|---|---|---|
| d [mm] | Mean | 0.70 | 0.66 | 0.71 |
| | Max. | 0.85 | 0.79 | 0.82 |
| d′ [mm] | Mean | 0.93 | 0.90 | 1.01 |
| | Max. | 1.15 | 1.13 | 1.52 |
| φ [°] | Mean | 0.72 | 0.79 | 0.82 |
| | Max. | 0.94 | 0.91 | 0.96 |
| φ′ [°] | Mean | 1.04 | 0.96 | 1.11 |
| | Max. | 1.45 | 1.24 | 1.38 |

5. Discussions
Hand–eye calibration is one of the essential components when developing a robot-
assisted, image-guided pedicle screw insertion system. The accuracy of hand–eye cali-
bration will have a direct influence on the system accuracy. However, it is challenging
to develop an accurate and robust method for hand–eye calibration. In this paper, we
proposed an effective hand–eye calibration method based on tool-tip pivot calibration and
paired-point matching without the need to solve the AX = XB equation. Comprehensive
experiments were conducted to validate the accuracy of our proposed hand–eye calibration
method as well as the robot-assisted, image-guided pedicle screw insertion system. Both
qualitative and quantitative results demonstrate the efficacy of our hand–eye calibration
method and the high accuracy of our system.
In comparison with the SOTA hand–eye calibration methods, our method has the following advantages: First, our method is a simultaneous closed-form solution, which is derived by solving three overdetermined equation systems, guaranteeing an optimal solution. Second, unlike other simultaneous solutions, we reformulate the hand–eye calibration problem as solutions to tool-tip pivot calibrations in two coordinate systems and paired-point matching, taking advantage of the steady movement of the robot arm, thus reducing measurement errors and noise. Third, in comparison with methods depending on iterative
solutions [18–21] or probabilistic models [22,23], our method is much faster because it is
not an iterative solution and only requires simple matrix operations.
Based on the novel hand–eye calibration method, we further developed a robot-
assisted, image-guided pedicle screw insertion system. We conducted trajectory drilling
experiments on a plastic phantom, 3D-printed vertebrae, and pig vertebrae to validate
the accuracy of our system. When drilling trajectories on the plastic phantom, our system achieved a mean distance deviation of 0.93 mm and a mean angular deviation of 1.04°. When it was used to drill trajectories on the 3D-printed vertebrae, our system achieved a mean distance deviation of 0.90 mm and a mean angular deviation of 0.96°. To check whether the differences between results obtained from the plastic phantom and the 3D-printed vertebrae are statistically significant, we conducted an unpaired t-test and chose a significance level of α = 0.05. We found a p-value of 0.52 for the distance deviation and a p-value of 0.40 for the angular deviation. When drilling trajectories on the pig vertebrae, our system achieved a mean distance deviation of 1.01 mm and a mean angular deviation of 1.11°, which are regarded as accurate enough for pedicle screw insertion.

6. Conclusions
In this paper, we proposed a novel hand–eye calibration method, namely registration-
based hand–eye calibration (RHC), to estimate the calibration transformation via paired-
point matching without the need to solve the AX = XB equation. Based on the proposed
hand–eye calibration method, we developed a robot-assisted, image-guided pedicle screw
insertion system. Comprehensive experiments were conducted to investigate the influence of the range of robot movement on the hand–eye calibration, to compare our method with state-of-the-art methods, and to evaluate the overall system accuracy. Our experimental results
demonstrate the efficacy of our hand–eye calibration method and the high accuracy of
our system. Our novel hand–eye calibration method can be applied to other types of
robot-assisted surgery.

Author Contributions: Conceptualization, G.Z.; Data curation, W.S., J.L. and Y.Z.; Formal analysis,
W.S.; Funding acquisition, G.Z.; Investigation, W.S., J.L. and Y.Z.; Methodology, W.S. and G.Z.; Project
administration, G.Z.; Software, W.S.; Supervision, G.Z.; Validation, J.L. and Y.Z.; Visualization, W.S.;
Writing—original draft, W.S.; Writing—review & editing, G.Z. All authors have read and agreed to
the published version of the manuscript.
Funding: This research was partially supported by the Shanghai Municipal Science and Technology
Commission (20511105205) and by the National Natural Science Foundation of China (U20A20199).
Institutional Review Board Statement: The study was conducted in accordance with the Declaration
of Helsinki, and approved by the Institutional Review Board of School of Biomedical Engineering,
Shanghai Jiao Tong University, China (Approval No. 2020031, approved on 8 May 2020).
Informed Consent Statement: Not applicable.
Data Availability Statement: Not applicable.
Conflicts of Interest: The authors declare no conflict of interest.

Abbreviations
The following abbreviations are used in this manuscript:

CT Computed Tomography
API Application Programming Interface
COS Coordinate System
SVD Singular Value Decomposition
RHC Registration-based Hand–eye Calibration
3D Three-dimensional
References
1. Tian, N.F.; Xu, H.Z. Image-guided pedicle screw insertion accuracy: A meta-analysis. Int. Orthop. 2009, 33, 895–903. [CrossRef]
[PubMed]
2. Fan, Y.; Du, J.; Zhang, J.; Liu, S.; Xue, X.; Huang, Y.; Zhang, J.; Hao, D. Comparison of accuracy of pedicle screw insertion among
4 guided technologies in spine surgery. Med. Sci. Monit. Int. Med. J. Exp. Clin. Res. 2017, 23, 5960. [CrossRef] [PubMed]
3. Nguyen, N.Q.; Priola, S.M.; Ramjist, J.M.; Guha, D.; Dobashi, Y.; Lee, K.; Lu, M.; Androutsos, D.; Yang, V. Machine vision
augmented reality for pedicle screw insertion during spine surgery. J. Clin. Neurosci. 2020, 72, 350–356. [CrossRef] [PubMed]
4. Solomiichuk, V.; Fleischhammer, J.; Molliqaj, G.; Warda, J.; Alaid, A.; von Eckardstein, K.; Schaller, K.; Tessitore, E.; Rohde,
V.; Schatlo, B. Robotic versus fluoroscopy-guided pedicle screw insertion for metastatic spinal disease: A matched-cohort
comparison. Neurosurg. Focus 2017, 42, E13. [CrossRef] [PubMed]
5. Molliqaj, G.; Schatlo, B.; Alaid, A.; Solomiichuk, V.; Rohde, V.; Schaller, K.; Tessitore, E. Accuracy of robot-guided versus
freehand fluoroscopy-assisted pedicle screw insertion in thoracolumbar spinal surgery. Neurosurg. Focus 2017, 42, E14. [CrossRef]
[PubMed]
6. Kim, H.J.; Jung, W.I.; Chang, B.S.; Lee, C.K.; Kang, K.T.; Yeom, J.S. A prospective, randomized, controlled trial of robot-assisted vs
freehand pedicle screw fixation in spine surgery. Int. J. Med. Robot. Comput. Assist. Surg. 2017, 13, e1779. [CrossRef]
7. Shaw, K.A.; Murphy, J.S.; Devito, D.P. Accuracy of robot-assisted pedicle screw insertion in adolescent idiopathic scoliosis: Is
triggered electromyographic pedicle screw stimulation necessary? J. Spine Surg. 2018, 4, 187. [CrossRef]
8. Wu, L.; Ren, H. Finding the kinematic base frame of a robot by hand-eye calibration using 3D position data. IEEE Trans. Autom.
Sci. Eng. 2016, 14, 314–324. [CrossRef]
9. Liu, G.; Yu, X.; Li, C.; Li, G.; Zhang, X.; Li, L. Space calibration of the cranial and maxillofacial robotic system in surgery. Comput.
Assist. Surg. 2016, 21, 54–60. [CrossRef]
10. Shiu, Y.C.; Ahmad, S. Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form
AX= XB. IEEE Trans. Robot. Autom. 1989, 5, 16–29. [CrossRef]
11. Tsai, R.Y.; Lenz, R.K. A new technique for fully autonomous and efficient 3 d robotics hand/eye calibration. IEEE Trans. Robot.
Autom. 1989, 5, 345–358. [CrossRef]
12. Wang, C.C. Extrinsic calibration of a vision sensor mounted on a robot. IEEE Trans. Robot. Autom. 1992, 8, 161–175. [CrossRef]
13. Chou, J.C.; Kamel, M. Finding the position and orientation of a sensor on a robot manipulator using quaternions. Int. J. Robot.
Res. 1991, 10, 240–254. [CrossRef]
14. Daniilidis, K. Hand-eye calibration using dual quaternions. Int. J. Robot. Res. 1999, 18, 286–298. [CrossRef]
15. Andreff, N.; Horaud, R.; Espiau, B. On-line hand-eye calibration. In Proceedings of the Second International Conference on 3-D
Digital Imaging and Modeling (Cat. No. PR00062), Ottawa, ON, Canada, 8 October 1999; pp. 430–436.
16. Lu, Y.C.; Chou, J.C. Eight-space quaternion approach for robotic hand-eye calibration. In Proceedings of the 1995 IEEE
International Conference on Systems, Man and Cybernetics. Intelligent Systems for the 21st Century, Vancouver, BC, Canada,
22–25 October 1995; Volume 4, pp. 3316–3321.
17. Zhao, Z.; Liu, Y. Hand-eye calibration based on screw motions. In Proceedings of the 18th International Conference on Pattern
Recognition (ICPR’06), Hong Kong, China, 20–24 August 2006; Volume 3, pp. 1022–1026.
18. Zhuang, H.; Shiu, Y.C. A noise-tolerant algorithm for robotic hand-eye calibration with or without sensor orientation measurement.
IEEE Trans. Syst. Man, Cybern. 1993, 23, 1168–1175. [CrossRef]
19. Wei, G.Q.; Arbter, K.; Hirzinger, G. Active self-calibration of robotic eyes and hand-eye relationships with model identification.
IEEE Trans. Robot. Autom. 1998, 14, 158–166.
20. Mao, J.; Huang, X.; Jiang, L. A flexible solution to AX= XB for robot hand-eye calibration. In Proceedings of the 10th WSEAS
International Conference on Robotics, Control and Manufacturing Technology, Hangzhou, China, 11–13 April 2010; pp. 118–122.
21. Zhang, Z.; Zhang, L.; Yang, G.Z. A computationally efficient method for hand–eye calibration. Int. J. Comput. Assist. Radiol. Surg.
2017, 12, 1775–1787. [CrossRef]
22. Li, H.; Ma, Q.; Wang, T.; Chirikjian, G.S. Simultaneous hand-eye and robot-world calibration by solving the AX = YB problem
without correspondence. IEEE Robot. Autom. Lett. 2015, 1, 145–152. [CrossRef]
23. Ma, Q.; Li, H.; Chirikjian, G.S. New probabilistic approaches to the AX= XB hand-eye calibration without correspondence. In
Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May
2016; pp. 4365–4371.
24. Aiguo, L.; Lin, W.; Defeng, W. Simultaneous robot-world and hand-eye calibration using dual-quaternions and Kronecker
product. Int. J. Phys. Sci. 2010, 5, 1530–1536.
25. Ali, I.; Suominen, O.; Gotchev, A.; Morales, E.R. Methods for simultaneous robot-world-hand–eye calibration: A comparative
study. Sensors 2019, 19, 2837. [CrossRef]
26. Hirsh, R.L.; DeSouza, G.N.; Kak, A.C. An iterative approach to the hand-eye and base-world calibration problem. In Proceedings
of the 2001 ICRA, IEEE International Conference on Robotics and Automation (Cat. No. 01CH37164), Seoul, Korea, 21–26 May
2001; Volume 3, pp. 2171–2176.
27. Strobl, K.H.; Hirzinger, G. Optimal hand-eye calibration. In Proceedings of the 2006 IEEE/RSJ International Conference on
Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 4647–4653.
28. Shah, M.; Eastman, R.D.; Hong, T. An overview of robot-sensor calibration methods for evaluation of perception systems. In
Proceedings of the Workshop on Performance Metrics for Intelligent Systems, Gaithersburg, MD, USA, 19–21 August 2012;
pp. 15–20.
29. Morgan, I.; Jayarathne, U.; Rankin, A.; Peters, T.M.; Chen, E. Hand-eye calibration for surgical cameras: A procrustean
perspective-n-point solution. Int. J. Comput. Assist. Radiol. Surg. 2017, 12, 1141–1149. [CrossRef] [PubMed]
30. Özgüner, O.; Shkurti, T.; Huang, S.; Hao, R.; Jackson, R.C.; Newman, W.S.; Çavuşoğlu, M.C. Camera-robot calibration for the da
vinci robotic surgery system. IEEE Trans. Autom. Sci. Eng. 2020, 17, 2154–2161. [CrossRef] [PubMed]
31. Roberti, A.; Piccinelli, N.; Meli, D.; Muradore, R.; Fiorini, P. Improving rigid 3-d calibration for robotic surgery. IEEE Trans. Med.
Robot. Bionics 2020, 2, 569–573. [CrossRef]
32. Sun, Y.; Pan, B.; Guo, Y.; Fu, Y.; Niu, G. Vision-based hand–eye calibration for robot-assisted minimally invasive surgery. Int. J.
Comput. Assist. Radiol. Surg. 2020, 15, 2061–2069. [CrossRef]
33. Tang, Y.; Zhou, H.; Wang, H.; Zhang, Y. Fruit detection and positioning technology for a Camellia oleifera C. Abel orchard based
on improved YOLOv4-tiny model and binocular stereo vision. Expert Syst. Appl. 2023, 211, 118573. [CrossRef]
34. Valassakis, E.; Dreczkowski, K.; Johns, E. Learning Eye-in-Hand Camera Calibration from a Single Image. In Proceedings of the
Conference on Robot Learning, PMLR, London, UK, 8–11 November 2021; pp. 1336–1346.
35. Huo, J.; Meng, Z.; Zhang, H.; Chen, S.; Yang, F. Feature points extraction of defocused images using deep learning for camera
calibration. Measurement 2022, 188, 110563. [CrossRef]
36. Kim, H.S.; Kuc, T.Y.; Lee, K.H. Hand-eye calibration using images restored by deep learning. In Proceedings of the 2020 IEEE
International Conference on Consumer Electronics-Asia (ICCE-Asia), Seoul, Korea, 1–3 November 2020; pp. 1–4.
37. Low, K.L. Linear least-squares optimization for point-to-plane icp surface registration. Chapel Hill Univ. North Carol. 2004, 4, 1–3.
38. Khamene, A.; Sauer, F. A novel phantom-less spatial and temporal ultrasound calibration method. In Proceedings of the
International Conference on Medical Image Computing and Computer-Assisted Intervention, Palm Springs, CA, USA, 26–29
October 2005; pp. 65–72.
39. Petersen, P. Linear Algebra; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012.
40. Shah, M. Solving the robot-world/hand-eye calibration problem using the Kronecker product. J. Mech. Robot. 2013, 5, 031007.
[CrossRef]
