Article
Master–Slave Control System for Virtual–Physical Interactions
Using Hands
Siyuan Liu and Chao Sun *
Test Automation and Control Institute, Harbin Institute of Technology, Harbin 150006, China;
[email protected]
* Correspondence: [email protected]
Abstract: Among the existing technologies for hand protection, master–slave control technology
has been extensively researched and applied within the field of safety engineering to mitigate the
occurrence of safety incidents. However, it has been identified through research that traditional
master–slave control technologies no longer meet current production and lifestyle needs, and they
have even begun to pose new safety risks. To resolve the safety risks exposed by traditional master–
slave control, this research fuses master–slave control technology for hands with virtual reality
technology, and the design of a master–slave control system for hands based on virtual reality
technology is investigated. This study aims to realize the design of a master–slave control system
for virtual–physical interactions using hands that captures the position, orientation, and finger joint
angles of the user’s hand in real time and synchronizes the motion of the slave interactive device
with that of a virtual hand. With amplitude limiting, jitter elimination, and a complementary filtering
algorithm, the original motion data collected by the designed glove are turned into a Kalman-filtering-
algorithm-based driving database, which drives the synchronous interaction of the virtual hand
and a mechanical hand. As for the experimental results, the output data for the roll, pitch, and
yaw were in the stable ranges of −0.1◦ to 0.1◦ , −0.15◦ to 0.15◦ , and −0.15◦ to 0.15◦ , respectively,
which met the accuracy requirements for the system’s operation under different conditions. More
importantly, these data prove that, in terms of accuracy and denoising, the data-processing algorithm
was relatively compatible with the hardware platform of the system. Based on the algorithm for the
virtual–physical interaction model, the authors introduced the concept of an auxiliary hand into the
research, put forward an algorithmic process and a judgement condition for the stable grasp of the virtual hand, and solved a model-penetrating problem while enhancing the immersive experience during virtual–physical interactions. In an interactive experiment, a dynamic accuracy test was run on the system. As shown by the experimental data and the interactive effect, the system was satisfactorily stable and interactive.

Keywords: master–slave control system; virtual–physical interaction; Kalman attitude fusion algorithm; stable virtual grasping algorithm

1. Introduction
Virtual reality technology, which is one of the most powerful human–computer interfaces, has seen rapid development in recent years. Its applications have been extensively employed in various fields, such as simulation training [1], medical rehabilitation [2], and manufacturing [3]. Hand-based virtual–physical interactions have long been a research focus, as humans heavily rely on their hands to interact with the external world [4]. Thus, using both hands for natural and flexible manipulation of virtual objects has become a primary choice for users. This mode of interaction is more intuitive and convenient than traditional methods, such as those involving a mouse, keyboard, or touch screen [5], and it aligns better with human behavioral habits and cognitive logic.
In recent years, research teams and organizations at home and abroad have explored a
multitude of ways of capturing hand behavior and manipulating virtual objects. The main
challenge is controlling and recreating a 3D hand model [6] of a user’s hand movements in
real time with key data parameters, including the spatial position, orientation, and motion
behavior. The current mainstream methods for motion data capture are primarily divided
into two categories [7]: one uses an optical system, and the other involves a control system
composed of various sensors. Although they are capable of independently capturing hand
movements, optical systems fail to bypass certain limitations, such as spatial constraints,
lighting conditions, and obstructions of motion markers [8]. These environmental factors
limit data collection and the fluidity and integrity of motions [9]. Conversely, control
systems based on multiple sensors cleverly circumvent these issues. Their compact design
allows for attachment to the user, thus capturing motion data more accurately and reliably,
and they are even applicable in complex working environments. To enhance the precision
of data acquisition and the effectiveness of calibration techniques, data gloves have been
proposed [10]; these primarily provide mapping between hand angles and the induction
of a sensor, rather than using anatomy-based cross-effects [11]. In the design proposed by
Saggio et al. [12], a novel sensor array structure was adopted to ensure alignment between
the sensors at the fingertips, thus addressing the issues of friction and blockage. Spatial
positioning and tracking schemes are vital innovations concerning VR, the interactions
between computers and humans, and robot control [13]. By calculating the attitude data
of an inertial measurement unit (IMU) installed on the torso, motion trajectories can be
tracked in real time [14]. However, because the precision of the inertial components is not
so high, data drift and cumulative errors occur over time. To overcome this phenomenon, a
Kalman data fusion algorithm is commonly used to estimate the attitude angle.
Since the birth of robotic manipulators, master–slave control has been an important
research topic, as it enables operators to remotely synchronize the control of a manipulator
in various hazardous environments. The research on master–slave control started in the
1940s when researchers developed a robotic arm as a slave control device to replace the
human arm. Currently, there are two mainstream approaches to master–slave control [15]:
One utilizes optical systems, such as cameras and sensors, while the other employs a control
system composed of multiple sensors, such as piezoelectric and inertial sensors [16,17].
In optical systems, the motion data of a hand are captured and tracked by using optical
devices, such as cameras. Shamaie et al. proposed a generic hand-tracking algorithm that
captures hand velocity through cameras and identifies dual-hand actions [18]. Manchanda
et al. introduced a method for controlling mouse pointer movement by using simple
gestures and a camera, and they incorporated a region algorithm for scaling the movement
when the user moved away from the capture point [19]. Lai et al. presented a MEMS-based
motion capture system design and demonstrated its ability to accurately reproduce human
motion with an error of one degree [20]. Ceolini et al. proposed a sensor framework
that integrated complementary systems and a fully neural morphological sensor fusion
algorithm for gesture recognition, and they improved the energy–latency product by a factor of
30 to 600 [21]. In optical systems, hand-tracking algorithms and optical
devices are used to capture and recognize hand motions. Although optical systems can
independently capture hand movements, there are still challenges that cannot be avoided,
such as spatial limitations, lighting conditions, and occlusion of motion markers during
capture, which restrict data collection, smoothness, and completeness and can prevent
users from experiencing immersion. However, the advantage of optical systems is that they
do not require additional devices to be worn and do not restrict freedom of movement,
making them commonly used for indoor full-body motion capture.
With the advancement of technology and market demands, a wide range of sensors
with complete varieties and reliable performance have flooded the market. Research teams
and development institutions have successfully introduced various wearable devices as
interfaces for human–computer interaction. Data gloves have been widely applied as hand
interaction devices. A data glove [22] is a somatosensory human–computer interaction
device that converts a user’s hand movements into electrical signals through motion data
collection modules on the glove. The host computer consequently receives the transmitted
signals, which are read and interpreted to control the robotic or virtual hand in order to
replicate the hand movements of the user. The primary purpose is to establish a mapping
relationship between the hand and the robotic fingertips and to overcome cross-effects
based on anatomical constraints [23]. In [24], Silva et al. integrated bend sensors into a
glove, developed a low-cost data glove based on the Arduino Uno microcontroller, and
compared it with commercially available data gloves in the achievement of a virtual reality
system. They designed and adopted a novel structure for a sensor array to ensure alignment
between the top sensors of the fingers, thus resolving negligible friction and blocking issues
and reducing deviation from the placement of the internal fingers. In [25], Youngsu et al.
demonstrated the application of piezoelectric sensors as virtual reality interface devices by
using flexible sensors made of PVDF material in a complete system. They compared the
processed sensor outputs with real angles obtained from camera recordings and discussed
the sensor performance. In [26], Mochamad et al. designed a virtual reality system for
smartphones by using IMU (inertial measurement unit) sensors and flex bend sensors
through the Unity engine. The test results showed a reduction in the rotational error to
1%, indicating that there was room for improvement. In [27], Saad et al. discussed an
experimental study correlating the voltage output of a data glove with the bending angles
of fingers. Through an analysis that involved polynomial regression, they allowed the
angle-based interpretation of the voltage output, which is easier for users to understand.
The experimental results demonstrated that the proposed method could convert voltage
into angles with 100% accuracy, thus providing a viable theoretical approach for
future research.
This design is focused on intelligent hand wearable devices and, specifically, the design
of hardware circuits and development of a software system. By fully utilizing resources,
we derived a more reasonable hand posture solution model and applied suitable correction
optimization methods. We designed a master–slave human–computer interaction system
based on virtual reality involving hands, and we established an experimental platform
based on Unity3D and Matlab in order to carry out a series of experiments and research. The
second part of this article discusses the overall design scheme of the system and the design
concept of the HDT data acquisition platform, and it introduces the working principles of
the main functional modules. The third part discusses the analysis and simple processing of
the raw data. The fourth part primarily discusses the deep processing of the data, including
the filtering of the data and the posture solution, which make the data cleaner and easier to
read. The fifth part discusses the system’s functionality and primarily introduces the design
and control principles of a five-finger bionic mechanical hand, in addition to discussing
the virtual grasping algorithm that we used in the virtual interaction process. The sixth
part primarily introduces the debugging of the functions of the whole system, discusses
the communication methods in each part of the system, and describes experiments on the
system’s functionality. The seventh part concludes this study by summarizing the main
content of this research and future research directions.
uses a posture solution algorithm to analyze and calculate the motion information of the key parts of the hand in real time, transmit it to the background data management system for data fusion and filtering through wireless communication technology, accurately calculate the current spatial coordinates and motion direction, and transmit them to the Unity3D-based virtual interaction platform and master–slave control platform through serial port communication for synchronous mapping of the operator's hand motion and posture. The interactive process of the master–slave virtual hand interaction system is shown in Figure 1.

Figure 1. Interactive process of the master–slave virtual–real hand interaction system.
1. Hand Data Acquisition Platform (HDT)
After analyzing existing hand motion acquisition units, this study summarizes two mainstream technological routes. One is that of optical data acquisition systems, and the other is that of information perception control systems composed of various sensors. In optical systems, hand motion data are collected and tracked by using optical devices, such as cameras. Hand motions are captured and recognized through hand-tracking algorithms and optical devices. A complete optical system consists of multiple sets of red optical lenses, reflective markers, POE switches, calibration frames, and cables, as shown in Figure 2. A typical capture system for optical motion employs several cameras with high speeds for tracking and monitoring specific attributes from various angles, which are then combined with algorithms that involve a skeleton solution to consummate the capture. In theory, the 3D location of a specific space can be determined at a particular moment if it is simultaneously observed by at least three cameras. When a camera photographs constantly with a significant frame rate, the trajectory may be derived from the sequence of images. Although an optical system can independently capture hand motions, some problems cannot be avoided, such as space limitations, lighting conditions, and the occlusion of motion markers during the capture process. These environmental impacts limit data collection and the smoothness and completeness of the motions, making it difficult for users to achieve an immersive experience. However, the advantage is that such systems do not require additional wearables, do not restrict human freedom of activity, and are commonly used for indoor full-body dynamic data capture.
Figure 2. Optical data acquisition system.
A hand information perception and control system composed of sensors, which is also known as a "data glove", comprises bend sensors, posture sensors, signal receivers, and microcontrollers, as shown in Figure 3. Its small size allows the system to be attached to the user and to adapt to complex environments, thus capturing motion data more flexibly and reliably. By utilizing the relationship between the output voltage and degree of bending of the bend sensors, the motion information of the fingers is calculated. Posture sensors are affixed to the main body parts, and posture signals are transmitted to a microcontroller through wireless transmission methods, such as Bluetooth, for posture calculation. The posture sensors obtain the posture information of the limbs through integrated inertial sensors, gravity sensors, accelerometers, magnetometers, and gyroscopes. In combination with the length information and hierarchical connections of the skeleton, the spatial positions of the joints are calculated. Because such systems are suitable for complex environments, they can capture motion data more accurately and reliably. The accuracy and stability of such systems do not decrease due to environmental factors.
Figure 3. Data glove.
In conclusion, a design scheme based on a hand information perception and control system composed of sensors as the data collection platform is adopted in our system to capture the angles of an arm's posture in real time and to track hands' motion trajectories in space.
2. Virtual and Real Interaction Platform Based on Unity3D
In virtual reality, interaction is indispensable. Interaction refers to when a program's output generates results that are consistent with changes and effects that are expected according to a user's input behavior. Unity3D is a comprehensive integration of professional virtual engines that allow users to easily create 3D games, real-time 3D animations, and other types of interactive content. Model movement can be controlled in Unity3D through certain script files, and some physical characteristics, such as collision, acceleration, and contact, can be simulated. In this study, we used Unity3D as our interactive program development platform to implement real-time mapping of hand movements. After an in-depth study of this platform, we found that its modeling capabilities are poor, and the software allows only the most basic geometric shapes. Complex models must be built in advance by using other software, so we chose to use the 3Dmax software for hand modeling and then imported the data into Unity3D for development.
3. Master–Slave Control Platform
The master–slave control platform is the link in the physical interaction with the entire system. The slave interaction device of this platform, that is, the robotic hand, should have the adaptive ability to grasp different objects and should be able to complete actions that are consistent with the movements of human hands. After studying the structure of robotic hands, we found that the key factors affecting the stability, convenience, and efficiency of grabbing are the number of fingers, the number of joints, and the number of degrees of freedom in the fingers. The advantages and disadvantages of robotic hands with different structures are shown in Table 1. The systems for the propulsion of industrial robots are classified as hydraulic, pneumatic, or electric based on the origin of their power. If necessary, they may be integrated to form a complex system. Table 2 outlines the characteristics of each fundamental system, as well as the relevant pros and cons.
Table 1. Advantages and disadvantages of robotic hands with different structures.

Number of Fingers | Number of Joints | Degrees of Freedom of the Fingers | Advantages | Disadvantages
2 | 3 | 6  | Simple control; no redundancy | Poor grasping effect
3 | 3 | 9  | Good grasping effect | Poor adaptive grasping
5 | 3 | 5  | Strong adaptive effect; no redundancy | Average grasping effect
5 | 3 | 15 | Strong grasping ability; good grasping effect | Complex control; existence of redundancy
Figure 4. Structure of the system.
Data acquisition is crucial to this system. The constructed HDT data glove employed flexible sensors [28] and posture sensors as hand data collection units, thus facilitating the mapping of virtual palm movements. In the forearm posture tracking system, posture sensors were also used in the forearm posture detection module. This module's primary function was to capture real-time forearm posture angles and track the hand's motion trajectory in space. The data captured by the sensors were processed by the glove's control unit and the forearm posture module through a Kalman posture fusion algorithm and by using data-amplitude-limit jitter reduction. The acquired hand motion data were then transmitted to the host computer via Bluetooth and Wi-Fi (wireless communication) for data fusion analysis, and the current spatial coordinates and motion direction were then calculated.
Figure 5 displays the hardware framework of the HDT, which was composed of five parts: the main control module (ATmega328P microcontroller), the power module (AMS1117-5-3.3 voltage regulation module), the data collection module (five-way-flexible sensors and an attitude sensor), the monitoring module (TB6612 current drive chip), and the communication module (Bluetooth HC-05).
Figure 5. Diagram of the hardware system's framework.
The primary workflow of the framework is shown in Figure 5.
(1) An external 3–5 V power supply was connected to an AMS1117-5-3.3 voltage regulation circuit module. The principle of the AMS1117-5-3.3 voltage regulation circuit is shown in Figure 6. The voltage regulation circuit module transformed the power supply into a stable internal power supply with an output of 5 V (or 3.3 V).
Figure 6. Schematic diagram of the voltage regulation circuit.
(2) The LM1117-5 voltage regulation circuit module provided a stable power supply for the MPU9250 attitude sensor, the flexible sensor, and the main control system, while the main control system monitored whether the current was stable by using the pulse wave output by the TB6612 current drive chip. The principle of the circuit of the current drive system is shown in Figure 7.
Figure 7. Schematic diagram of the current drive system.

Figure 8. Schematic diagram of the master module.
(4) The main control system (ATmega328P microcontroller) integrated and filtered the data that were collected from the attitude sensor and the flexible sensor and transmitted them to a PC for processing through the HC-05 Bluetooth module.
To facilitate subsequent testing of the practicality and reliability of this hardware system, we simply built a 1.0 version of the HDT based on this description of the hardware framework, as shown in Figure 9.
In this framework, the collection of hand motion data was crucial. We used five-way-flexible sensors and an attitude sensor as our data collection module. The five-way-flexible sensors were attached to parts of the fingers to detect the degrees of finger bending. When the fingers were bent, the controller's microcontroller was connected to the sensor ends through the I/O port. When the voltage changed, the microcontroller was able to read the voltage value through the I/O port, and the degree of finger bending was calculated and quantified in analog quantities. The attitude sensor was placed in the center of the back of the hand to detect real-time hand posture angle data. The data were input into the IIC pin of the microcontroller through the IIC interface of the sensor, thus enabling the controller to obtain data on the hand posture angles. The MPU9250 is an integrated accelerometer, magnetometer, and gyroscope sensor; therefore, it reduced the size and power consumption of the system and allowed the inter-axis difference caused by the presence of multiple sensors in the system to be effectively avoided.
In the hand posture tracking system, we also developed two upper- and lower-arm modules based on the MPU9250 attitude sensor. They were placed in the central position of the upper and lower arms. Their serial port interfaces were connected with the serial port interface of the WIFI chip, thus allowing the MPU9250 data packet to be transformed into a wireless signal through the WIFI chip. The main function of this module was capturing the posture angle of the upper and lower arms in real time and tracking the motion trajectory of the hand in space.
The posture sensor used in this system was the MPU9250, which consisted of two parts. Three-axis accelerometers and three-axis gyroscopes made up one component. The second component was the AK8963 from AKM. Therefore, the MPU9250 enabled the tracking of motion with nine axes. Its unique digital motion processor (DMP) was located inside the MPU9250 and could directly process data, thus reducing the number of tasks for the main control chip and requiring only the transmission of the three-axis gyroscope, accelerometer, and magnetometer values to the DMP. The IIC approach was able to directly provide all nine axes' data. Integrated design, movement integration, and clock calibration eradicated the time-consuming choice of complicated processors and peripheral expenses, thus guaranteeing the best performance. Additionally, this device offered an IIC interface to allow compatibility with comparable sensors, such as connecting pressure sensors.
The posture angle solution refers to the angle acquired by accumulating and integrating the acceleration, angular velocity, and magnetic field intensity (all with three axes) via the attitude sensor. The results are the roll angle (roll), pitch angle (pitch), and yaw angle (yaw). The coordinate system used for the posture angle solution was the east–north–sky coordinate system. The module was placed in the positive direction, as shown in Figure 10, with the left as the x-axis, the front as the y-axis, and the top as the z-axis. The order of rotation of the coordinate system when the Euler angle represented the posture was defined as z-y-x, i.e., first rotating around the z-axis, then the y-axis, and, finally, the x-axis.

Figure 10. Diagram of the north–east–sky coordinate system.
3. Data Analysis
To obtain the motion data of the fingers, palms, and arms, a single-chip microcomputer was employed to analyze and calculate the data acquired by the flexible sensors and posture sensors.
3.1. Analysis of the Bending Degrees of the Fingers
A five-part flexible sensor whose output voltage varied with the bending deflection and was positively correlated with finger flexion was attached to the fingers. The correlation between the output voltage and finger flexion is illustrated in Figure 11.
Figure 11. Correlation between output voltage and finger flexion.
The output voltage of the flexible sensors ranged from 1000 mV to 4000 mV. Thus, it was assumed that the data read by the single-chip microcomputer were from AF1 to AF5. Though a byte is in the numerical range of 0–255, the most suitable range for mapping is from 50 to 200; then, a byte can be used to represent a finger's flexion.

an = AFn/20, an ∈ (50, 200)
AF1.1, the data of the first joint in the thumb, ranges from −100° to 50° in Unity. Thus, Equation (1) (AF1.1 = a1 − 150) can be used to map this range.

Ltz = (lfresult1[1] − 150) + UI.lthum_a    (1)

Here, Ltz refers to AF1.1, lfresult1[1] refers to a1, and UI.lthum_a refers to the deviation value of the slider, which was manually set in the interface of Unity so as to adjust the finger model. Naturally, the data of the remaining 13 joints can be calculated in the same way. The only difference among these data lies in the range of flexion of the joints.
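To make the mapping concrete, the following is a minimal sketch of the byte conversion an = AFn/20 and of Equation (1); the helper names and the example slider offset are illustrative assumptions rather than identifiers from the authors' source code.

```python
# Minimal sketch of the finger-flexion mapping described above.
# AF_n is the flexible-sensor voltage in millivolts (1000-4000 mV), so
# a_n = AF_n / 20 falls in (50, 200) and fits in one byte; Equation (1)
# then shifts it into the joint-angle range used by the Unity model.

def voltage_to_byte(af_mv: float) -> int:
    """Map a sensor reading in mV (1000-4000) to the byte range (50, 200)."""
    return int(round(af_mv / 20.0))

def byte_to_joint_angle(a_n: int, slider_offset: float = 0.0) -> float:
    """Equation (1): with a_n in (50, 200), (a_n - 150) spans -100 to +50 deg,
    matching the stated range of the first thumb joint; slider_offset plays
    the role of the UI.lthum_a deviation value."""
    return (a_n - 150) + slider_offset

if __name__ == "__main__":
    a1 = voltage_to_byte(2600.0)      # e.g. 2600 mV -> 130
    print(byte_to_joint_angle(a1))    # -> -20.0 degrees
```

The remaining joints follow the same pattern, differing only in the offset and in the flexion range of each joint.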
θ = arctan(ay/az)    (3)
ax, ay, and az refer to the acceleration values on the three axes. Since the accelerometer
could not sense the rotation angle on the z-axis, the magnetometer, which was used to
measure the magnetic induction intensity, needed to be employed to help calculate the yaw.
(1) When the magnetometer was parallel to the ground, the yaw could be calculated on the basis of Equation (4).

ψ = arctan(mny/mnx)    (4)
mny and mnx refer to the magnetic field intensity on the y-axis and x-axis, respectively.
(2) When the magnetometer was not parallel to the ground, tilt compensation could be adopted to reduce errors and help calculate the yaw. Due to the angle between n (geographic) and b (carrier) that was included when using these as the coordinate systems, Equation (5) was necessary for the geographic coordinate system's magnetic field.

[mnx; mny; mnz] = Cbn [mbx; mby; mbz]    (5)

mbx, mby, and mbz refer to the magnetic field intensity measured on the three axes of the carrier coordinate system. Cbn refers to the cosine orientation matrix.
Cbn = [ cos β    sin β sin θ    sin β cos θ
        0        cos θ         −sin θ
        −sin β   cos β sin θ    cos β cos θ ]    (6)
With Equations (5) and (6), the tilt compensation formula can be deduced.
mnx = mbx cos β + mby sin β sin θ + mbz sin β cos θ (7)
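The following sketch strings Equations (3)–(7) together to obtain a tilt-compensated yaw from one accelerometer and magnetometer sample. The roll formula is the usual accelerometer-based one (its counterpart equation is not shown in this excerpt), so treat it as an assumption; atan2 is used instead of a bare arctangent for quadrant robustness.

```python
import math

def pitch_from_accel(ax, ay, az):
    # Equation (3): theta = arctan(ay / az)
    return math.atan2(ay, az)

def roll_from_accel(ax, ay, az):
    # Assumed accelerometer roll formula (not given in the excerpt).
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def tilt_compensated_yaw(mb, theta, beta):
    """Yaw from body-frame magnetometer readings mb = (mbx, mby, mbz).

    theta and beta are the two tilt angles used in Equation (6); the
    projection below applies the first two rows of the cosine orientation
    matrix, i.e. Equation (7) and its y-axis counterpart, then Equation (4).
    """
    mbx, mby, mbz = mb
    mnx = (mbx * math.cos(beta)
           + mby * math.sin(beta) * math.sin(theta)
           + mbz * math.sin(beta) * math.cos(theta))       # Equation (7)
    mny = mby * math.cos(theta) - mbz * math.sin(theta)    # second row of Cbn
    return math.atan2(mny, mnx)                            # Equation (4)
```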
4. Data Calibration
4.1. Calibration of the Bending Degrees of the Fingers
The data output by the flexible sensor were the values of the voltage, which could experience small jumps. Therefore, a cumulative jitter elimination filtering algorithm was used. When the collected data continuously jumped beyond the set threshold, this indicated that the overall numerical range of the element had changed. Therefore, a new element was assigned to the output value. If the change did not exceed the threshold or did not continuously exceed the threshold, this indicated that the change in the data was in a small range of noise, which was filtered out, and the output remained at the original value. This algorithm can effectively remove small-range jitters in data, thus making the data smoother. For possible high-amplitude errors, a limited-amplitude filtering algorithm was adopted to identify and eliminate noise that exceeded a certain amplitude and to take the true value of the data within the threshold. The original data were filtered through these two filtering algorithms to make them more realistic and stable. A comparison of the effects before and after filtering is illustrated in Figure 12.
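A minimal sketch of the two pre-filters described above is given below: large spikes are rejected by amplitude limiting, and a new output level is accepted only after the change persists for several consecutive samples. The thresholds and the confirmation count are illustrative values, not the ones used on the glove.

```python
# Sketch of the amplitude-limiting and cumulative jitter-elimination filters.
class AmplitudeLimitJitterFilter:
    def __init__(self, max_step=300.0, jitter_threshold=40.0, confirm_count=3):
        self.max_step = max_step              # amplitude-limiting threshold
        self.jitter_threshold = jitter_threshold
        self.confirm_count = confirm_count    # jumps needed to accept a new level
        self.output = None
        self._pending = 0

    def update(self, sample: float) -> float:
        if self.output is None:               # first sample initializes the filter
            self.output = sample
            return self.output
        step = sample - self.output
        if abs(step) > self.max_step:
            # Amplitude limiting: a huge jump is treated as noise, keep old value.
            return self.output
        if abs(step) > self.jitter_threshold:
            # Cumulative jitter elimination: accept the new level only after it
            # persists for confirm_count consecutive samples.
            self._pending += 1
            if self._pending >= self.confirm_count:
                self.output = sample
                self._pending = 0
        else:
            # Small fluctuations are regarded as noise and filtered out.
            self._pending = 0
        return self.output
```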
Figure 12. Comparison of the original data and the filtered data.

Figure 13. Flowchart of the Kalman filtering algorithm.
xk refers to the state vector at moment k. yk refers to the observation vector at moment k. A refers to the state-transition matrix from moment k − 1 to moment k. H refers to the gain matrix from the state vector to the observation vector. q and r refer to the input noise and the observation noise, whose covariance matrices are represented by Q and R. Supposing that x̂k− is the estimated value at moment k and x̂k is the adjusted value at moment k, then the following holds:

x̂k− = A x̂k−1 + q    (9)

x̂k = x̂k− + K (yk − H x̂k−)    (10)

where K refers to the Kalman gain matrix, which is a key factor in accurately estimating the state.

Kk = Pk|k−1 HT (H Pk|k−1 HT + R)−1    (11)

Here, Pk|k−1 refers to the pre-estimated error covariance matrix.

Pk|k−1 = A Pk−1 AT + Q    (12)

Pk refers to the error covariance matrix at moment k.

Pk = (I − Kk H) Pk|k−1    (13)

I refers to the unit matrix. In the source code of the program, the initial values of the coefficients of the Kalman pose fusion algorithm were as follows:

A = [1, −0.01; 0, 1], H = [1, 0], X̂0 = [0; 0], P0 = [1, 0; 0, 1], Q = [0.2, 0; 0, 0.1], R = 2
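The sketch below runs one predict/correct cycle of Equations (9)–(13) with the coefficients listed above. Interpreting the 2 × 2 matrices as a state of [angle, drift] with a 0.01 s step is an assumption, as is the noisy sinusoid used to imitate the Figure 14 test signal.

```python
import numpy as np

A = np.array([[1.0, -0.01], [0.0, 1.0]])   # state-transition matrix
H = np.array([[1.0, 0.0]])                 # observation (gain) matrix
Q = np.array([[0.2, 0.0], [0.0, 0.1]])     # input-noise covariance
R = np.array([[2.0]])                      # observation-noise covariance

x_hat = np.zeros((2, 1))                   # X_0
P = np.eye(2)                              # P_0

def kalman_step(y_k: float) -> float:
    """One predict/correct cycle following Equations (9)-(13)."""
    global x_hat, P
    x_pred = A @ x_hat                                          # Eq. (9), q folded into Q
    P_pred = A @ P @ A.T + Q                                    # Eq. (12)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)      # Eq. (11)
    x_hat = x_pred + K @ (np.array([[y_k]]) - H @ x_pred)       # Eq. (10)
    P = (np.eye(2) - K @ H) @ P_pred                            # Eq. (13)
    return float(x_hat[0, 0])

if __name__ == "__main__":
    t = np.arange(0, 10, 0.01)
    observed = (1 + 0.2 * t) * np.sin(2 * np.pi * t) + np.random.normal(0, 1.0, t.size)
    filtered = [kalman_step(y) for y in observed]
```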
The attitude estimation model derived above was implemented by modifying certain
parameter codes—specifically, matrices A and H—as well as the dimensions of other
matrices. The results of the implementation are shown in Figure 14, where the blue line on
the left represents the observed waveform of sinusoidal data with an increasing amplitude
and overlaid with random noise. The red line represents the waveform of the data after
they were filtered with the Kalman filter. The right side of the figure shows a localized
comparison that clearly indicates that the filtered data were smoother than the original
waveform.
Here, cQ and cR correspond to Q and R, respectively, in the equations. By reducing the ratio of cQ/cR, the waveform shown in Figure 15 was obtained.
Figure 14. Comparison of the initial effects of the Kalman filter.

Figure 15. Comparison of the effects after adjusting the parameters.
In Figure 16, the filtering results closely match the observed values, indicating that the Kalman filter allowed high confidence in the observations at this point.

Figure 16. Effect of the Kalman filter with high confidence.

The choice of observations is crucial in the Kalman filtering algorithm. The size of the observation error directly affects the effectiveness of Kalman filtering. To enhance the system's resistance to interference, ensure the control precision, enhance the stability of state observations, and mitigate the effects of external forces on observations, the Kalman fusion complementary filtering algorithm was used in the attitude solution algorithm of this system. We chose the attitude angle from the fusion of the complementary accelerometer and magnetometer filters as the state observation variables for the Kalman algorithm, and we utilized the gyroscope's data and noise to establish the state prediction equation for iteration. This allowed for the real-time acquisition of highly accurate and trustworthy attitude angles. The structure after the fusion of the two algorithms is shown in Figure 17.

Figure 17. Schematic diagram of the fusion algorithm.

The fundamental structure of the complementary filtering algorithm is depicted in Figure 18, and the relevant formula is presented in Equation (14).
φ = φg + k(φam − φg)
θ = θg + k(θam − θg)    (14)
ψ = ψg + k(ψam − ψg)
Figure 18. Schematic diagram of the complementary filter algorithm.
Here, φg, θg, ψg and φam, θam, ψam represent the attitude angles calculated from the gyroscope and from the accelerometer and magnetometer, respectively, while φ, θ, ψ signify the attitude angles after fusion.

To further verify the stability and accuracy of the posture data, we fixed the attitude sensor on a DC motor (MAXON RE35) with an encoder, rotated it 45° around its x-, y-, and z-axes, respectively, and obtained a set of sensor output data, as shown in Table 3.
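As a minimal illustration of Equation (14), the sketch below blends the gyroscope-integrated angles with the accelerometer/magnetometer angles using a fixed coefficient k; the value of k and the 100 Hz update rate are illustrative assumptions, not the system's tuned parameters.

```python
# Sketch of the complementary fusion step in Equation (14).
K_BLEND = 0.02      # weight given to the accelerometer/magnetometer angles
DT = 0.01           # update period in seconds (100 Hz, illustrative)

def complementary_update(angles_prev, gyro_rates, angles_am):
    """One fusion step for (roll, pitch, yaw).

    angles_prev : fused angles from the previous step (deg)
    gyro_rates  : angular rates from the gyroscope (deg/s)
    angles_am   : angles solved from the accelerometer and magnetometer (deg)
    """
    fused = []
    for prev, rate, am in zip(angles_prev, gyro_rates, angles_am):
        gyro_angle = prev + rate * DT                             # phi_g: gyro integration
        fused.append(gyro_angle + K_BLEND * (am - gyro_angle))    # Equation (14)
    return tuple(fused)
```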
The dynamic output waveform of the 45◦ rotation around the x-axis in the positive
direction and the return to the initial state is shown in Figure 19. The motor was set to start
rotating 45◦ in the positive direction around the x-axis (roll angle) at moments 0–3, to return
to the initial state at moments 3–6, and to stay at moments 6–8. From the waveform, it can
be seen that the process of change in the angle was stable and smooth, with no distortion;
moreover, it remained relatively stationary in the direction of the y-axis (pitch angle) and
z-axis (yaw angle), and the output waveform was stable within ±0.2◦ . In the tests with two
posture sensors, the outputs of the roll, pitch, and yaw were stable at ±0.1◦ , ±0.15◦ , and
±0.15◦ , respectively, signifying the suitable IMU sensor compatibility of the algorithm with
respect to the reduction in noise and precision [32]. These data can be used to map hand
posture actions in three-dimensional space.
Figure 19. Output waveform of the posture sensor.

Euler angles are the most familiar form of angles to many people. Their three axes are coupled and display independent variations only in small angles. For larger values, the attitude angles change in a coupled manner. For example, when the x-axis approached 90 degrees, even if the attitude rotated solely around the x-axis, the y-axis angle also underwent significant changes. This is an inherent characteristic of the representation of Euler angles, and it is known as gimbal lock. In Unity3D, Euler angles also suffer from gimbal lock issues, leading to spasms and halts when two axes are on the same plane. To avoid gimbal lock issues, angles must be represented in quaternion form, as shown in Equation (15):
q = λ + P1 i + P2 j + P3 k (15)
where λ represents the scalar part, while P1 i + P2 j + P3 k stands for the vector part. The
conversion from Euler angles into quaternions is shown in Equation (16).
q = [λ, P1, P2, P3]T, where
λ  = cos(φ/2) cos(θ/2) cos(ψ/2) + sin(φ/2) sin(θ/2) sin(ψ/2)
P1 = cos(φ/2) sin(θ/2) cos(ψ/2) + sin(φ/2) cos(θ/2) sin(ψ/2)
P2 = cos(φ/2) cos(θ/2) sin(ψ/2) − sin(φ/2) sin(θ/2) cos(ψ/2)
P3 = sin(φ/2) cos(θ/2) cos(ψ/2) − cos(φ/2) sin(θ/2) sin(ψ/2)    (16)
In this equation, φ, θ, ψ denote the rotation angles around the x-, y-, and z-axes,
respectively. They are expressed by using Tait–Bryan angles—specifically, the roll, pitch,
and yaw.
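The following is a direct transcription of Equation (16) as printed above, keeping the paper's ordering of the quaternion components (λ, P1, P2, P3); the function name is illustrative.

```python
import math

def euler_to_quaternion(roll: float, pitch: float, yaw: float):
    """Convert roll (phi), pitch (theta), yaw (psi) in radians to a quaternion,
    following Equation (16) component by component."""
    cphi, sphi = math.cos(roll / 2), math.sin(roll / 2)
    cth, sth = math.cos(pitch / 2), math.sin(pitch / 2)
    cpsi, spsi = math.cos(yaw / 2), math.sin(yaw / 2)

    lam = cphi * cth * cpsi + sphi * sth * spsi
    p1 = cphi * sth * cpsi + sphi * cth * spsi
    p2 = cphi * cth * spsi - sphi * sth * cpsi
    p3 = sphi * cth * cpsi - cphi * sth * spsi
    return (lam, p1, p2, p3)
```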
Figure 20. Configuration of the mechanical hand.

Figure 22. Motion model of the three-link structure.
The characteristics of the link structure in kinematics can be interpreted with D–H theory. That is to say, any position of the hand in the local coordinates can be interpreted as the following matrix A in global coordinates.
Figure 23. Five-fingered bionic manipulator.
Currently, two prevalent mapping methods are in use: joint angle mapping and fingertip position mapping. Fingertip mapping projects the operator's fingertip position onto a robotic hand to enable accurate mirroring of the operator's hand movements in three-dimensional space. To ensure smooth mapping, it is necessary to establish corresponding palm and joint coordinate systems between the human hand and the robotic hand, thus ensuring one-to-one correspondence. Compared with that in fingertip mapping, the computation process in joint angle mapping is more streamlined, as only proportional mapping is required to generate movement instructions for the robotic hand. The effects of both control methods can meet the requirements of work environments that do not require precise control. In this system, we selected joint angle mapping as our motion mapping method. The implementation process for master–slave control is illustrated in Figure 24. Before executing master–slave control, the operator first needs to wear a data glove and perform maximum-extension and fist-clenching movements. The purpose of this is for the data management software to record the maximum (A_Umax) and minimum (A_Umin) bending angles of the five finger joints of the operator. Given the structural constraints of our robotic hand, the range of motion of each finger joint was predetermined; thus, we also set the maximum (A_Cmax) and minimum (A_Cmin) bending angles of the robotic finger joints. At a certain moment, if the bending angles of the operator's hand and the robotic hand were A_Umid and A_Cmid, respectively, then A_Cmid could be obtained by using Equation (17).
A_Cmid = ((A_Umid − A_Umin)/(A_Umax − A_Umin)) × (A_Cmax − A_Cmin) + A_Cmin    (17)
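To make the proportional mapping of Equation (17) concrete, the following short Python sketch (function and variable names are our own illustration, not the system's actual code) converts a calibrated operator joint angle into a robotic joint command:

```python
def map_joint_angle(a_umid, a_umin, a_umax, a_cmin, a_cmax):
    """Proportionally map an operator joint angle onto the robotic joint range (Equation (17))."""
    # Normalize the operator's current bending angle against the calibrated range A_Umin..A_Umax.
    ratio = (a_umid - a_umin) / (a_umax - a_umin)
    # Clamp to guard against readings slightly outside the calibration range.
    ratio = min(max(ratio, 0.0), 1.0)
    # Rescale into the robotic finger's permissible range A_Cmin..A_Cmax.
    return ratio * (a_cmax - a_cmin) + a_cmin

# Example: an operator joint calibrated to 5-85 degrees, a robotic joint limited to 0-60 degrees.
print(map_joint_angle(45.0, 5.0, 85.0, 0.0, 60.0))  # -> 30.0
```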
Figure 24. Master–slave control process.
5.2. Realization of Virtual–Physical Interaction
5.2.1. Realization of Virtual Grasping
The tracking device and the driven virtual hand were unidirectionally coupled, which meant that the former could drive the latter, but the latter could not drive the former. There was therefore no closed-loop control between the user's hand and the virtual hand, so the virtual hand's fingers could penetrate a virtual object, and the engine was unable to address the issue, as the tracking data drove the hand without taking any simulation instabilities into account. To address the issue of penetration, we devised a proxy model that was linked to the virtual hand and was capable of providing a visual reaction to its environment. Collision and grabbing occurred between the 3D virtual object and the proxy hand. The process is shown in Figure 25.
Figure 25. Collision and grabbing process.
This problem is seen as the problem of penetration depth [33]. Figure 26 shows a penetration analysis of the proxy hand from both the geometric and graphical perspectives. The penetration depth is a measure of the degree of mutual penetration between collision penetration models. It is widely used in path planning, physics-based simulation, haptic simulation, and computer-aided modeling. The classifications of penetration depth are translational (PDt) and generalized (PDg). The primary distinction is that the latter can detach overlapping interactive objects at the same time. We assume that there are two models that overlap, A and B, with A moving and B remaining stationary. Object A's initial local coordinates are identical to those of the global system O. Objects A and B can be separated by using a fixed motion. The minimum penetration depth of A with respect to B is then computed with PDg in the six-DOF configuration space. The depth measurement is provided in Equation (18) [34]:
PDgσ(A, B) = min{σA(q, 0) | interior(A(q)) ∩ B = ∅, q ∈ F}    (18)
where q is the initial collision state of A, F is the non-collision space, and σ is the measure
of the distance between the two poses of A in a six-degree-of-freedom space. Here, we refer
to it as the target norm.
Figure 26. Collision and grabbing principle.
Usually, any distance metric can be chosen to define σ. Here, the target norm is chosen as the unit of distance measurement, as it has a compact analytical expression and is invariant to the reference coordinate system. For object A, the target norm in positions q0 and q1 is defined as
σA(q0, q1) = (1/V) ∫x∈A (x(q0) − x(q1))² dx = (4/V)(Ixx·q1² + Iyy·q2² + Izz·q3²) + q4² + q5² + q6²    (19)
where x(q) refers to the point where A is in position q. [q1, q2, q3] refers to the quaternion vector—or the quaternion vector of the rotation transformation—which marks the relative azimuth difference between q0 and q1. [q4, q5, q6] refers to the relative position difference between q0 and q1. V refers to the volume of A, and I refers to the diagonal element of the inertial tensor matrix of A.
The virtual model was mapped to an auxiliary model for a dynamic simulation of the two objects. The tracking of the initial motion of the virtual hand and the auxiliary hand could be controlled on the basis of the acquired data and the contact state. In the interaction, the auxiliary hand—the main interactive hand—was used to test whether there was an effective contact. The whole interaction can be interpreted in terms of four states: the contactless state, contact state, penetration state, and release state (Figure 27).
(1) Contactless State (S1): The hand is not in contact with the manipulated object.
(2) Contact State (S2): Starting to separate from the virtual hand, the auxiliary hand remains at the contact spot.
(3) Penetration State (S3): The auxiliary hand grasps the object that is being manipulated as the virtual one penetrates it.
(4) Release State (S4): The auxiliary hand moves away with the virtual one, thereby releasing the object that is being manipulated.
Figure 27. Three states of contact and grabbing.
If the hand and the manipulated object are not in contact, the auxiliary hand moves with the virtual hand. However, once the two are in contact, the auxiliary hand stops moving with the virtual hand and remains relatively stationary at the contact spot, even when the object is penetrated by the virtual hand. When releasing, the auxiliary hand continues to move with the virtual hand. With the release state comes the contactless state, and thus, the grabbing interaction circulates. The difference in the interactions of the virtual hand and the auxiliary hand is illustrated in Figure 28.
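The transition logic described above can be summarized in the following sketch (illustrative Python of our own; the state names follow S1–S4 from Figure 27, while the boolean inputs and their sources are assumptions rather than the authors' implementation):

```python
# Illustrative proxy-hand state machine; S1-S4 follow Figure 27, everything else is assumed.
CONTACTLESS, CONTACT, PENETRATION, RELEASE = "S1", "S2", "S3", "S4"

def update_proxy_state(state, touching, penetrating, grasp_satisfied):
    """Advance the auxiliary (proxy) hand state from simple per-frame contact flags."""
    if state == CONTACTLESS:
        return CONTACT if touching else CONTACTLESS
    if state == CONTACT:
        if penetrating and grasp_satisfied:
            return PENETRATION           # the proxy hand grasps while the virtual hand penetrates
        return CONTACT if touching else RELEASE
    if state == PENETRATION:
        return PENETRATION if grasp_satisfied else RELEASE
    return CONTACTLESS                   # RELEASE loops back to the contactless state

def proxy_follows_virtual(state):
    """The proxy hand tracks the virtual hand except while it is held at the contact spot."""
    return state in (CONTACTLESS, RELEASE)
```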
Figure 28. The difference between the collision and grabbing processes of the virtual hand and the proxy hand.
5.2.2. Realization of Stable Grasping
The method of relative thresholds was employed to determine whether an object could be grasped, thereby making the grasp as realistic as possible. First, we located each object's axis-aligned bounding box; consequently, we determined the face of the bounding box that was most proximate to the hand. By calculating the diagonal length of this face, we could determine a real-time relative threshold in order to map the size of the graspable face onto a threshold range from 0 to 1. This is illustrated in Figure 29.
Figure 29. Diagram of grab judgement for the proxy hand.
We assume that x represents the size of the face. Then, x can be mapped from the
object size range [Omin , Omax ] to the threshold range [ Tmin , Tmax ]. Therefore, the first step
is computing the location ρ of x within the object size range [Omin , Omax ], as shown in
Equation (20):
ρ = (x − Omin)/(Omax − Omin)    (20)
where x is the diagonal length of the face, Omin is the diagonal length of the smallest face of
the object, and Omax is the diagonal length of the largest face of the object.
Having obtained ρ, we then compute the corresponding value within the threshold range [Tmin, Tmax], which represents the threshold λ required for the object, as demonstrated in Equation (21):
λ = ρ × (Tmax − Tmin) + Tmin    (21)
where λ is the threshold required to grasp the object, Tmax is the maximum threshold, and
Tmin is the minimum threshold. An object is only grasped when the grasping posture
satisfies the threshold relative to the size of the object, thus ensuring that users can grasp
more naturally and stably.
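A compact sketch of this relative-threshold judgement is given below (our own illustrative Python; the normalized grasp-posture input and the default threshold range are assumptions, and the mapping follows Equations (20) and (21)):

```python
def grasp_threshold(face_diagonal, o_min, o_max, t_min=0.0, t_max=1.0):
    """Map the diagonal of the bounding-box face closest to the hand into [t_min, t_max]."""
    rho = (face_diagonal - o_min) / (o_max - o_min)   # Equation (20)
    rho = min(max(rho, 0.0), 1.0)                     # clamp for safety
    return rho * (t_max - t_min) + t_min              # Equation (21)

def can_grasp(grasp_posture, face_diagonal, o_min, o_max, t_min=0.0, t_max=1.0):
    """An object is grasped only when the normalized grasping posture reaches its threshold."""
    return grasp_posture >= grasp_threshold(face_diagonal, o_min, o_max, t_min, t_max)
```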
The stable grasping of an object can also be determined based on a physics-based
method, i.e., by using the calculation of the force during the interaction process as a factor
in assessing the stability of the grasp. From a mechanical and dynamic perspective, the
balance between the force and the torque acting on the manipulated object is one of the
conditions for stable grasping. Suppose that the virtual object and the virtual hand have n
contact points, the normal force at the i-th contact point is set to F (i ), the frictional force is
f (i ), and the size of the gravitational force of the object is Fg . Therefore, the condition for
stable grasping can be expressed by Formula (22):
|∑F| = ∑(i=1..n) F(i) + ∑(i=1..n) f(i) + Fg = 0
|∑M| = ∑(i=1..n) F(i) × ri + ∑(i=1..n) f(i) × ri = 0    (22)
where ∑ M, ∑ F represent the combined torque and combined force, and ri represents the
radial vector between the contact point and the center of mass of the object.
When the above equilibrium conditions are satisfied, stable grasping can be achieved.
Subsequently, the coordinate system of the virtual object merges with the root coordinate
system of the virtual palm, thus enabling the virtual hand to translate or rotate the object.
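For completeness, a numerical check of the equilibrium condition in Formula (22) might look like the sketch below (NumPy-based; the tolerance, the data layout, and the way the gravitational force is supplied are our assumptions):

```python
import numpy as np

def is_stable_grasp(normal_forces, friction_forces, contact_points, center_of_mass,
                    object_weight, tol=1e-3):
    """Check the force and torque balance of Formula (22) over n contact points."""
    F = np.asarray(normal_forces, dtype=float)      # n x 3 normal forces F(i)
    f = np.asarray(friction_forces, dtype=float)    # n x 3 friction forces f(i)
    r = np.asarray(contact_points, dtype=float) - np.asarray(center_of_mass, dtype=float)
    Fg = np.asarray(object_weight, dtype=float)     # gravitational force vector of the object
    net_force = F.sum(axis=0) + f.sum(axis=0) + Fg
    net_torque = np.cross(r, F).sum(axis=0) + np.cross(r, f).sum(axis=0)  # r_i x F(i), r_i x f(i)
    return np.linalg.norm(net_force) < tol and np.linalg.norm(net_torque) < tol
```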
Figure 30. Back-end data management software.
It is well known that the size and shape of each person's hand vary. To allow different users to have a good interactive experience, this data management platform had data correction functions, such as "initial attitude calibration" and "initial finger calibration". Their purpose was to record the range of motion of different users' fingers and palms and normalize them, thus enabling the hand motion data of each user to fit the system's data format and solving the data matching problem for different users. The specific calibration methods were as follows.
(1) Initial finger calibration: The user wore the data glove and performed fist-clenching and finger-stretching movements five times. The system recorded the range of finger movement and normalized it, as shown in Figure 31.
Figure 31. Initial finger calibration.
(2) Initial palm attitude calibration: While wearing the data glove, the user followed the operation shown in Figure 32. Clicking on "end attitude calibration" completed the hand attitude calibration.
It is worth noting that data transmission was a key function of this platform. To
enable the cross-platform transfer of motion data from the data acquisition platform to
the virtual–physical interaction platform, we utilized the User Datagram Protocol (UDP).
UDP sockets facilitate data transfer between programs and support data transmission
between two distinct platforms. For this instance, virtual port numbers were used to send
data to memory addresses, and Unity constructed a listener for data retrieval. Upon the
establishment of a connection, real hand movements could be captured and analyzed,
thereby enabling the mapping of the operator’s hand movements in a virtual environment
by using a virtual hand. The overall data packet of the software was as follows.
Figure 32. Initial attitude calibration.
0x55 + 0x53 + RollL + RollH + PitchL + PitchH + YawL + YawH + 0x55 + 0x53 + RollL + RollH + PitchL + PitchH + YawL + YawH + 0x55 + 0x53 + RollL + RollH + PitchL + PitchH + YawL + YawH + 0xaa + a1 + a2 + a3 + a4 + a5 (30 bit). The specific definitions are as follows.
This software packaged data into 30 bit data packets, which included posture angle and finger-bending data. The first 24 bits consisted of data from three posture sensors, with each contributing 8 bits of data. The data frame began with the header "0x55, 0x53", immediately followed by six bits of data that represented the roll, pitch, and yaw. Here, "RollL, RollH, PitchL, PitchH, YawL, YawH" were the raw data from the sensor's IIC interface and not the direct angle values. The last six bits represented finger curvature data, starting with the data frame header 0xaa and ending with the curvature of the five fingers.
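As a minimal sketch of how such a frame could be unpacked on the receiving side (our own illustration, assuming each listed field is one byte, that the frame arrives intact over UDP, and a low/high byte order):

```python
def parse_frame(packet: bytes):
    """Split one 30-byte frame into three raw posture triples and five finger curvatures."""
    if len(packet) != 30:
        raise ValueError("expected a 30-byte frame")
    sensors = []
    for i in range(3):                        # three posture sensors, 8 bytes each
        chunk = packet[i * 8:(i + 1) * 8]
        if chunk[0:2] != b"\x55\x53":         # per-sensor header "0x55, 0x53"
            raise ValueError("bad posture header")
        roll_raw  = chunk[2] | (chunk[3] << 8)    # RollL, RollH
        pitch_raw = chunk[4] | (chunk[5] << 8)    # PitchL, PitchH
        yaw_raw   = chunk[6] | (chunk[7] << 8)    # YawL, YawH
        sensors.append((roll_raw, pitch_raw, yaw_raw))
    if packet[24] != 0xAA:                    # finger-curvature segment header
        raise ValueError("bad finger header")
    fingers = list(packet[25:30])             # a1-a5: curvature of the five fingers
    return sensors, fingers
```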
6.2. Testing the Interactive Virtual Reality System
The commonly encountered "OK" and "V-sign" gestures were used as our experimental targets. The interaction is shown in Figure 33.
As can be seen in Figure 34 and Table 5, the virtual hand was able to accurately and authentically interact with the human hand; the process was smooth and continuous, and the movements remained consistent. Figures 5 and 6 validate this process; we observed that the "OK" gesture and the "V-sign" gesture were completed within three seconds. Here, a1–a5 correspond, respectively, to the middle finger, little finger, ring finger, thumb, and index finger. During the movement process, there were no significant spikes or severe jitters, and the waveform aligned with the action response. After the action was completed, the movement was sustained for another five seconds. The output results indicated that the interaction stability also met our expectations. However, we noted a minor jump in the red waveform during the "V-sign" gesture, which was primarily caused by the instability of the resistance value of the flexible sensor when maintained in a bent state. In the filtering algorithm for the finger-bending degree, the range of this jump did not reach our set threshold, so it was not processed. However, this result did not affect the interaction, and it returned to the normal range in a short time after the jump occurred.
Therefore, from the static interaction experiment, it was found that the motion of the
virtual hand essentially aligned with the human hand’s movement, the jitter phenomenon
was not pronounced, and, overall, it satisfied the requirements for the system.
Figure 34. Output waveforms of static gestures. (a) Data waveform of gesture "ok". (b) Data waveform of gesture "v".
Table 5. Degree of finger bending at 10 s.
Based on the interaction results of static gestures and the calibration of the attitude sensor outputs, dynamic grasping experiments were conducted. In the dynamic tests, we used a geometric sphere as our grasping object. The dynamic grasping tests were divided into three movements: object locking, cradling, and side holding. The interaction results of dynamic grasping are shown in Figure 35.
Figure 36. Waveforms of dynamic grasping.
6.3. Debugging of the Overall System
The technologies adopted for communication among the various devices in this system included serial communication, Bluetooth, and Wi-Fi. Firstly, the data glove and forearm communication module transmitted the collected finger-bending and forearm posture motion data to our back-end data management software for data processing through Bluetooth and Wi-Fi modules, respectively. The data management software sent the hand movement information to the execution program for virtual interaction via a virtual serial port, which mapped to our virtual hand. The control board of the robotic hand was connected to the host computer via a serial bus to allow the motion data to drive the robotic hand to complete the corresponding hand movements. The system's data communication method is illustrated in Figure 37.
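As an illustrative sketch of the final hop, from the host computer to the robotic hand's control board over the serial bus (using pySerial; the port name, baud rate, and frame layout are placeholders rather than the authors' actual protocol):

```python
import serial  # pySerial

def send_joint_angles(port_name, angles):
    """Forward five mapped joint angles (0-255 each) to the robotic hand's control board."""
    # Port name and baud rate are placeholders; the real controller settings may differ.
    with serial.Serial(port_name, baudrate=115200, timeout=1) as port:
        frame = bytes([0xAA] + [int(a) & 0xFF for a in angles])  # assumed simple framing
        port.write(frame)

# Example call with illustrative values:
# send_joint_angles("COM3", [30, 60, 90, 120, 45])
```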
Figure 37. The flow of system data transfer.
The five-finger biomimetic robotic hand after the assembly and wiring of all of its components and after being powered up is depicted in Figure 39. After testing, the robotic hand proved to be structurally stable; the servomotor ran smoothly, and the functional requirements of the system were met.
Figure 39. Five-fingered bionic manipulator.
With the master–slave control system and the virtual interaction system, the system adjustments were completed. The results of the master–slave control of the hand based on virtual reality are illustrated in Figure 40.
Figure 40. Virtual reality master–slave interactions using a hand.
7. Conclusions
This study implemented a master–slave control system using hands based on virtual reality technology. By integrating virtual reality technology with master–slave control technology, it was possible to capture the position, direction, and finger joint angles of a user's hand in real time. The system allowed for the coordination of a human hand, a mechanical hand, and a virtual hand, and it was tested for its stability and interactivity. The experimental results demonstrated that the system performed excellently in dynamic tracking and can provide users with a natural and agile virtual interaction experience.
This study accomplished the following tasks.
(1) Due to the urgent need for hand protection and the common problems with existing hand master–slave control technologies, we combined master–slave control with virtual reality technology and proposed a design scheme for a master–slave control system using hands. We analyzed and identified the four important components of the system, namely, the hand data collection platform, the back-end data management platform, the Unity3D virtual–physical interaction platform, and the master–slave control platform. The working principles and design schemes of each main part were explained in detail.
(2) In line with the overall design scheme and requirements, a detailed design and
explanation of the system’s hardware structure and constituent components were
provided. The hardware included a data glove and a five-fingered bionic mechanical
hand, while the software part involved data management software and a virtual
interaction program, which eventually resulted in the realization of the debugging
and operation of the overall system.
(3) In terms of hand posture calculation, we designed a data analysis model for the
finger-bending degree and palm posture angles. In terms of data processing, we
proposed the integration of complementary filtering based on Kalman filtering, thus
fully exploiting the advantages of the two algorithms and compensating for their
shortcomings.
(4) In research on virtual–real interactions of the hand, we proposed a proxy hand
solution, which solved the problem of mutual penetration between the virtual hand
and virtual objects during the virtual interaction process. For the judgement of
stable grasping, two judgement conditions were proposed, which addressed the
non-immersive experience brought by the lack of physical constraints in the virtual
world.
Although the system achieved the expected design goals, virtual reality technology
involves complex interaction processes, and there are many factors affecting the interaction
experience. This study mainly judged the stability of an interaction process based on the
forces, speed, and depth of collision between rigid bodies. However, an analysis of other
physical constraints between non-rigid bodies was not performed. Therefore, in future
research, it will be necessary to continuously expand and conduct in-depth research on this
system.
Author Contributions: Conceptualization, S.L. and C.S.; methodology, S.L.; software, S.L.; validation,
S.L. and C.S.; formal analysis, S.L.; investigation, S.L. and C.S.; resources, S.L. and C.S.; data curation,
S.L. and C.S.; writing—original draft preparation, S.L.; writing—review and editing, S.L. and C.S.;
visualization, S.L.; supervision, C.S.; project administration, S.L. and C.S.; funding acquisition, C.S.
All authors have read and agreed to the published version of the manuscript.
Funding: This research received no external funding.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Not applicable.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Tsai, Y.T.; Jhu, W.Y.; Chen, C.C.; Kao, C.H.; Chen, C.Y. Unity game engine: Interactive software design using digital glove for
virtual reality baseball pitch training. Microsyst. Technol. 2019, 24, 1401–1417. [CrossRef]
2. Jha, C.K.; Gajapure, K.; Chakraborty, A.L. Design and evaluation of an FBG sensor-based glove to simultaneously monitor flexure
of ten finger joints. IEEE Sens. J. 2020, 21, 7620–7630. [CrossRef]
3. Kim, H.; Choi, Y. Performance Comparison of User Interface Devices for Controlling Mining Software in Virtual Reality
Environments. Appl. Sci. 2019, 9, 2584. [CrossRef]
4. Xiao, W.Y.; Zhi, C.Z.; Li, Q.K.; Xie, H. The Key technologies of human-computer interaction based on virtual hand. Comput. Appl.
2015, 35, 2945–2949.
5. Mapes, D.P.; Moshell, J.M. A two-handed interface for object manipulation in virtual environments. Presence Teleoperators Virtual
Environ. 1995, 4, 403–416. [CrossRef]
6. Medeiros, D.; Carvalho, F.; Teixeira, L.; Braz, P.; Raposo, A.; Santos, I. Proposal and evaluation of a tablet-based tool for 3D virtual
environments. SBC J. Interact. Syst. 2013, 4, 30–42. [CrossRef]
7. Nasim, K.; Kim, Y.J. Physics-based assistive grasping for robust object manipulation in virtual reality. Comput. Animat. Virtual
Worlds 2018, 29, e1820. [CrossRef]
8. Li, Y.J.; Xue, S. Optical fiber data glove for hand posture capture. Opt.-Int. J. Light Electron Opt. 2021, 233, 166603. [CrossRef]
9. Fang, B.; Sun, F.; Liu, H.; Guo, D. A novel data glove using inertial and magnetic sensors for motion capture and robotic arm-hand
teleoperation. Ind. Robot. Int. J. 2017, 44, 155–165. [CrossRef]
10. Wan, K.; Aziz, A.A.; Ab, S.; Zaaba, S.K.; Ibrahim, Z.; Yusof, Z.M.; Ibrahim, I.; Mukred, J.A.; Mokhtar, N. Probability Distribution
of Arm Trajectory for Motion Estimation and Gesture Recognition. Adv. Sci. Lett. 2012, 13, 534–539. [CrossRef]
11. Ahmad, A.; Migniot, C.; Dipanda, A. Hand Pose Estimation and Tracking in Real and Virtual Interaction: A Review. Image Vis.
Comput. 2019, 89, 35–49. [CrossRef]
12. Saggio, G. A novel array of flex sensors for a goniometric glove. Sens. Actuators A Phys. 2014, 205, 119–125. [CrossRef]
13. Yang, W.K. Research on Attitude Recognition and Robot Tracking Control System. Master’s Thesis, Jilin University, Changchun,
China, 2021.
14. Lee, S.Y.; Bak, S.H.; Bae, J.H. An Effective Recognition Method of the Gripping Motion Using a Data Gloves in a Virtual Reality
Space. J. Digit. Contents Soc. 2021, 22, 437–443. [CrossRef]
15. Mitra, S.; Acharya, T. Gesture recognition: A survey. IEEE Trans. Syst. Man Cybern. 2007, 37, 311–324. [CrossRef]
16. Li, Y.; Zhang, Y.; Ye, X.; Zhang, S. Haptic rendering method based on generalized penetration depth computation. Signal Process.
2016, 120, 714–720. [CrossRef]
17. Li, X.; Huang, Q.; Yu, Z.; Zhu, J. A novel under-actuated bionic hand and its grasping stability analysis. Adv. Mech. Eng. 2017, 9,
1687814016688859. [CrossRef]
18. Shamaie, A.; Sutherland, A. Hand tracking in bimanual movements. Image Vis. Comput. 2005, 23, 1131–1149. [CrossRef]
19. Miller, A.T.; Allen, P.K. Graspit! a versatile simulator for robotic grasping. IEEE Robot. Autom. Mag. 2004, 11, 110–122. [CrossRef]
20. Zhang, Z.Y. Design of Motion Capture System Based on MEMS Inertial Measurement Units. Autom. Panor. 2017, 277, 85–88.
21. Ceolini, E.; Frenkel, C.; Shrestha, S.B.; Taverni, G.; Donati, E. Hand-gesture recognition based on EMG and event-based camera
sensor fusion: A benchmark in neuromorphic computing. Front. Neurosci. 2020, 14, 637–641. [CrossRef]
22. Song, P.; Fu, Z.; Liu, L. Grasp planning via hand-object geometric fitting. Vis. Computer 2018, 34, 257–270. [CrossRef]
23. Tian, H.; Wang, C.; Manocha, D.; Zhang, X.Y. Realtime hand-object interaction using learned grasp space for virtual environments.
IEEE Trans. Vis. Comput. Graph. 2018, 25, 2623–2635. [CrossRef] [PubMed]
24. Silva, L.; Dantas, R.; Pantoja, A. Development of a low cost dataglove based on arduino for virtual reality applications. In
Proceedings of the 2013 IEEE International Conference on Computational Intelligence and Virtual Environments for Measurement
Systems and Applications (CIVEMSA), Milan, Italy, 15–17 July 2013.
25. Cha, Y.; Seo, J.; Kim, J.S.; Park, J.M. Human–computer interface glove using flexible piezoelectric sensors. Smart Mater. Struct.
2017, 26, 057002. [CrossRef]
26. Hilman, M.; Basuki, D.K.; Sukaridhoto, S. Virtual hand: VR hand controller using IMU and flex sensor. In Proceedings of the
International Electronics Symposium on Knowledge Creation and Intelligent Computing (IES-KCIC), Roma, Italy, 29–30 October
2018.
27. Oprea, S.; Martinez-Gonzalez, P.; Garcia-Garcia, A.; Castro-Vargas, J.A.; Garcia-Rodriguez, J. A visually realistic grasping system
for object manipulation and interaction in virtual reality environments. Computer Graph. 2019, 83, 77–86. [CrossRef]
28. Huang, X.N.; Wang, Q.; Zang, S.Y.; Wan, J.X.; Ren, X.M. Tracing the Motion of Finger Joints for Gesture Recognition via Sewing
RGO-Coated Fibers onto a Textile Glove. IEEE Sens. J. 2019, 19, 9504–9511. [CrossRef]
29. Yan, P.Z.; Jing, L.; Ran, Y.; Lei, H.; Jin, L.C.; Jia, N.C. Attitude Solving Algorithm and FPGA Implementation of Four-Rotor UAV
Based on Improved Mahony Complementary Filter. Sensors 2022, 22, 6411.
30. Yang, Z.; Yan, S.; van Beijnum, B.J.; Li, B.; Veltink, P.H. Hand-Finger Pose Estimation Using Inertial Sensors. Magn. Sens. A Magnet.
2021, 21, 18115–18122.
31. Maitre, J.; Rendu, C.; Bouchard, K.; Bouchard, B.; Gaboury, S. Object recognition in performed basic daily activities with a
handcrafted data glove prototype. Pattern Recognit. Lett. 2021, 147, 181–188. [CrossRef]
32. Sim, D.; Baek, Y.; Cho, M. Low-Latency Haptic Open Glove for Immersive Virtual Reality Interaction. Sensors 2021, 21, 3682.
[CrossRef]
33. Wen, Z.W.; Hua, X. Design and Implementation of Collision Detection System in Virtual Simulation. J. Beijing Inst. Petro-Chem.
Technol. 2017, 25, 48–52.
34. Ling, Z. Research and Application of Hybrid Bounding Box Collision Detection Algorithm in Virtual Environment. Master’s
Thesis, Hangzhou DianZi University, Hangzhou, China, 2022.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.