
Pressure Sensor Positioning for Accurate Human Interaction with a Robotic Hand

Masoud Akhshik¹, Saeed Mozaffari¹, Rajmeet Singh¹, Simon Rondeau-Gagné², Shahpour Alirezaee¹
¹ Mechanical, Automotive, and Material Engineering Department, University of Windsor, Windsor, Canada
² Department of Chemistry and Biochemistry, University of Windsor, Windsor, Canada
{akhshik, saeed.mozaffari, rsbhourji, simon.rondeau-gagne, s.alirezaee}@uwindsor.ca

2023 International Symposium on Signals, Circuits and Systems (ISSCS). DOI: 10.1109/ISSCS58449.2023.10190966

Abstract— Sensor positioning involves determining the best location for a sensor to be placed or installed so that it can effectively sense or measure the desired physical or environmental parameters. In this paper, we present a novel approach for finding sensor positions on a robotic hand using machine learning techniques. We focus on pressure sensors and their placement in order to enhance the performance and reliability of gesture recognition tasks. Our study analyzes data from 22 sensors placed at different locations on a right-handed robotic hand, simulating 10 distinct hand gestures. We employ various machine learning algorithms and create a correlation matrix to determine the most relevant sensor positions. The results highlight the significance of sensor optimization in improving the overall efficiency and effectiveness of robotic hand systems. By reducing the number of sensors to 12, we could still differentiate the 10 hand gestures with 99% accuracy.

I. INTRODUCTION

Human-machine interaction (HMI) refers to the communication and collaboration between humans and machines, typically through technologies such as touchscreens, voice commands, gesture recognition, and other intuitive interfaces [1]. With the continued advancements in artificial intelligence and machine learning, humans can interact with sophisticated machines such as robots. HMI enables robots to perform tasks that require human-like dexterity, such as advanced manufacturing, fine surgery, and even uncharted space exploration [2]. A robotic hand, for example, is a mechanical device that mimics the function of a human hand. It typically consists of a series of interconnected joints and actuators that allow it to grasp and manipulate objects in a way that resembles the movement of a human hand. Robotic hands can be used for various purposes beyond prosthetics and industry, and their use could expand to social and humanoid robotics applications [3].

Considering the interaction between humans and robotic hands, the methods that have been suggested can generally be divided into two primary categories: vision-based techniques and sensor-based techniques [4]. Vision-based techniques rely on processing visual data from cameras to recognize and interpret hand gestures [5]. The vision system can consist of a single camera, such as a webcam or smartphone camera, or a stereo camera that obtains depth information from two simultaneous images. Some researchers have captured the 3D structure of the hand via light-coding devices such as the Microsoft Kinect. To improve the accuracy of gesture recognition, body markers have been attached to the user's hand [6]. Although vision-based techniques have high recognition performance, they suffer from line-of-sight occlusions and high power consumption, and they are sensitive to lighting conditions, background noise, and other factors that can impact the quality of the visual data being processed [6]. To address these problems, sensor-based techniques have been proposed, although such sensors may decrease the dexterity of the hands and reduce physical interaction. Accelerometer and gyroscope sensors [7] and IMU (inertial measurement unit) sensors [8] have been used for hand gesture recognition. For complex gestures, data-glove-based techniques have been proposed [9]; a data glove consists of an array of sensors worn on the user's hand [10]. Electromyography (EMG) sensors can detect neuronal electrical potentials to recognize muscle activity [11]. Pressure sensors have also been used frequently for muscle activity recognition [12].

HMIs rely heavily on sensors to collect data and provide accurate, up-to-date information to operators, allowing them to monitor and control industrial processes in real time. After the input signals are collected, they are transformed into commands and sent to the machine system for execution. In human-robot interaction, tactile and force sensors make it possible to collect biological signals, including body movements. Resistive, capacitive, and piezoelectric sensors are among the commonly utilized varieties of tactile and force sensors [13]. Each of these sensors has its own merits and drawbacks in terms of sensitivity, accuracy, and response time [14].

Although sensor-based techniques have high sensitivity, the sensors need to be positioned accurately to yield a precise hand gesture recognition system. This is mainly because the human hand is a complex system, and interacting with it is a challenging task: the human hand contains 27 bones, controlled by more than 30 muscles and 20 identified muscular branches [15]. The aim of this paper is to develop a human-machine interaction system to control a robotic hand through muscle signals, using pressure sensors. To build an accurate HMI system, we should find the optimum locations of the sensors on the human hand, since determining the optimal sensor positions can maximize efficiency and reliability [1-3]. In this study, we analyze data from 22 pressure sensors placed on a right-handed robotic hand, simulating 10 different hand gestures. By employing various machine learning algorithms, we create a heatmap/correlation matrix to identify the most relevant sensor positions, thereby optimizing the sensor placement for gesture recognition tasks.
II. SYSTEM ARCHITECTURE
The proposed setup for human-robotic hand interaction consists of a glove equipped with sensors to capture gestures, a robotic hand (Figure 1), and a communication application.

A. Robotic Hand
The robotic hand used in this study is a state-of-the-art, open-source mechanical hand that features LFD-01 servos for controlling each finger, as well as a more powerful LD-1501 servo that enables the hand to mimic the twists of a human wrist (Figure 1). The hand's rigid structure helps us measure better and improve our precision while still allowing a wide range of movements. The system uses an STM32 microcontroller that can be easily programmed and customized from a computer or an Android smartphone. Due to its similarities to an actual human hand, we find this robotic hand to be an ideal medium for exploring and developing new human-robot interaction applications.

Figure 1. Open-source robotic hand.
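The paper does not detail the communication application, so the following is a purely illustrative sketch. It assumes, hypothetically, that the hand's STM32 controller accepts newline-terminated, comma-separated servo angles over a serial link; the port name, baud rate, message format, and angle values are all invented for illustration.

# Hypothetical sketch of the communication application; the actual serial
# protocol of the hand's controller is not specified in the paper.
import serial  # pyserial

# Five finger angles plus one wrist angle, in degrees (illustrative values).
GESTURE_ANGLES = {
    "thumbs_up": [180, 0, 0, 0, 0, 90],
    "pinch_grasp": [120, 120, 0, 0, 0, 90],
}

def send_gesture(port: str, name: str) -> None:
    angles = GESTURE_ANGLES[name]
    with serial.Serial(port, 115200, timeout=1) as link:
        # One comma-separated line per command, e.g. "180,0,0,0,0,90\n".
        link.write((",".join(map(str, angles)) + "\n").encode("ascii"))

send_gesture("/dev/ttyUSB0", "thumbs_up")  # hypothetical port name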

B. Gesture Set
The hand gestures were selected based on their relevance to human-computer interaction and their potential applicability in a range of real-world scenarios. Additionally, the gestures were chosen to be replicable by our selected robotic hand. The set of 10 hand gestures included in the study represents a diverse range of movements, from basic wrist rotation to a more intricate tripod grasp, and includes common tasks such as grasping, pinching, and pointing. Figure 2 provides a visual representation of the 10 hand gestures used in this experiment.

Figure 2. The hand gestures used for this experiment.

III. HAND GESTURE RECOGNITION
The sensors placed on the hand can be seen in Figure 3. We implemented machine learning algorithms to predict the class of hand movement based on the information from all sensors.

Figure 3. Position of the pressure sensors on the hand.

A. Data Collection
To capture detailed information about the hand movements and gestures, data was collected from the kinematic hand movement database [16] for the right hand. Specifically, the dataset includes information from 22 pressure sensors (as shown in Figure 3), which were strategically placed at various locations on the hand to capture tactile information during the performance of different hand gestures. The data was collected with the aim of representing a wide range of interactions and movements, and thus includes examples of simple and complex gestures alike. By leveraging this rich dataset, we can gain valuable insights into the mechanics of human hand movements and inform the development of more advanced robotic systems that can mimic these movements with greater fidelity.
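As a minimal loading sketch: the dataset [16] is distributed as MATLAB .mat files, which can be read with scipy. The field names ("pressure", "gesture") and the filename below are assumptions for illustration; adjust them to the actual keys in the downloaded files.

# Minimal sketch of loading one recording from the database [16].
import numpy as np
from scipy.io import loadmat

def load_recording(path: str):
    """Return the (n_samples x 22) sensor matrix and per-sample gesture labels."""
    mat = loadmat(path)
    X = np.asarray(mat["pressure"], dtype=float)  # hypothetical key; shape (n, 22)
    y = np.asarray(mat["gesture"]).ravel()        # hypothetical key; shape (n,)
    return X, y

X, y = load_recording("subject01_right.mat")      # hypothetical filename
print(X.shape, np.unique(y))                       # e.g. (n, 22) and 10 gesture ids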
B. Machine Learning Techniques
For classifying hand gestures, we employed several machine learning algorithms [17]: multiclass logistic regression, multiclass neural networks, random forests, light gradient boosting, and boosted decision trees. Each algorithm was evaluated on its ability to correctly classify the hand gestures using the sensor data. The performance of each algorithm was determined by overall accuracy, which takes into account both macro and micro precision and recall.
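The paper does not specify its toolchain or hyperparameters, so the following is only a rough scikit-learn equivalent of the comparison, reusing X and y from the loading sketch above (lightgbm's LGBMClassifier could stand in for light gradient boosting).

# Rough scikit-learn analogue of the classifier comparison; hyperparameters
# are illustrative, not the paper's.
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

models = {
    "logistic_regression": make_pipeline(StandardScaler(),
                                         LogisticRegression(max_iter=1000)),
    "neural_network": make_pipeline(StandardScaler(),
                                    MLPClassifier(hidden_layer_sizes=(64,),
                                                  max_iter=500)),
    "random_forest": RandomForestClassifier(n_estimators=200),
    "boosted_trees": GradientBoostingClassifier(),
}

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    stratify=y, random_state=0)
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, model.score(X_test, y_test))  # overall accuracy on held-out data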
IV. EXPERIMENTAL RESULTS
In this section, we first classified gestures using the data from all 22 sensors; the same experiment was then performed with only the top 12 sensors, as explained below.

A. Evaluation Metrics
The performance of robotic hands in recognizing hand gestures can be assessed with various metrics, such as accuracy, precision, and recall. These criteria are calculated from the per-class false positive ($fp$), false negative ($fn$), true negative ($tn$), and true positive ($tp$) counts (Table 1).

Table 1. Evaluation equations, where $C$ is the number of gesture classes and $tp_i$, $fp_i$, $fn_i$, $tn_i$ are the counts for class $i$:

$\mathrm{Precision}_{\mu} = \frac{\sum_{i=1}^{C} tp_i}{\sum_{i=1}^{C} (tp_i + fp_i)}$

$\mathrm{Precision}_{M} = \frac{1}{C} \sum_{i=1}^{C} \frac{tp_i}{tp_i + fp_i}$

$\mathrm{Recall}_{\mu} = \frac{\sum_{i=1}^{C} tp_i}{\sum_{i=1}^{C} (tp_i + fn_i)}$

$\mathrm{Recall}_{M} = \frac{1}{C} \sum_{i=1}^{C} \frac{tp_i}{tp_i + fn_i}$

$\mathrm{Overall\ Accuracy} = \frac{\sum_{i=1}^{C} (tp_i + tn_i)}{\sum_{i=1}^{C} (tp_i + fp_i + fn_i + tn_i)}$
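For concreteness, the Table 1 metrics can be computed directly from a confusion matrix; the sketch below is a direct transcription of the micro- and macro-averaged definitions above.

# Table 1 metrics from a confusion matrix.
import numpy as np
from sklearn.metrics import confusion_matrix

def table1_metrics(y_true, y_pred):
    cm = confusion_matrix(y_true, y_pred)   # C x C counts, rows = true class
    tp = np.diag(cm).astype(float)
    fp = cm.sum(axis=0) - tp                # predicted as class i, truly another
    fn = cm.sum(axis=1) - tp                # truly class i, predicted as another
    return {
        "precision_micro": tp.sum() / (tp + fp).sum(),
        "precision_macro": np.mean(tp / (tp + fp)),
        "recall_micro": tp.sum() / (tp + fn).sum(),
        "recall_macro": np.mean(tp / (tp + fn)),
        # For single-label multiclass data, overall accuracy reduces to the
        # trace over the grand total (it equals micro precision and recall).
        "overall_accuracy": tp.sum() / cm.sum(),
    }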

B. Gesture Recognition Performance
Accurate and efficient gesture recognition is crucial for enabling effective human-robot interaction. To evaluate the performance of our system, we performed experiments using the dataset of ten distinct hand gestures (Figure 2). Table 2 shows the results of the top five models, each evaluated for their overall accuracy in predicting hand gesture classes. Impressively, all five models exhibit exceptional performance.

Table 2. Evaluation of the machine learning model performance in the classification of the hand gestures using all sensors' data.

Algorithm                      Overall Accuracy   Micro Precision   Macro Precision   Micro Recall   Macro Recall
Multiclass Neural Network      0.999877           0.999877          0.999876          0.999877       0.999875
Multiclass Decision Forest     0.999953           0.999953          0.999952          0.999953       0.999952
Multiclass Linear Regression   0.963821           0.963821          0.96364           0.963821       0.96361
Boosted Decision Tree          0.999999           0.999999          0.999999          0.999999       0.999999
Light Gradient Boosting        0.99998            0.999999          0.999999          0.99998        0.99998

When it comes to precision, the boosted decision tree algorithm stands out as the clear winner, surpassing even the most optimistic expectations. On the other hand, while linear regression produced relatively weaker results, it should not be overlooked, as it could still prove to be a valuable tool in certain contexts.

Our results demonstrate that the machine learning algorithms we employed were able to accurately classify most hand gestures with a high degree of precision and recall. This can be attributed to the many sensors installed on the human hand and the limited number of distinct actions. In realistic settings, when multiple gestures are performed simultaneously or when environmental factors affect the sensor data, gesture recognition would be more challenging.

C. Correlation Matrix
Through analyzing the correlation matrix, we can obtain valuable information about the relationships between sensors. The values in the correlation matrix range from -1 to 1, with -1 representing a full negative correlation, 0 demonstrating no correlation, and 1 showing a full positive correlation. Figure 4 shows the correlation matrix for the thumbs-up gesture. We hypothesized that sensors with correlation values near zero, which indicate independence, are more important than highly correlated sensors. Reducing the number of sensors also reduces the feature selection time.

Figure 4. Top) Correlation matrix for the thumbs up gesture.
Bottom) Correlation matrix for the pinch grasp gesture.
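The paper does not spell out the exact rule used to discard correlated sensors; the sketch below is one plausible reading of the hypothesis above, ranking sensors by their mean absolute correlation with the others and keeping the most independent ones.

# Per-gesture sensor correlation analysis; the reduction rule is an assumption.
import numpy as np

def sensor_correlations(X, y, gesture_id):
    """22 x 22 correlation matrix over the samples of one gesture."""
    return np.corrcoef(X[y == gesture_id].T)  # rows of corrcoef = sensors

def least_correlated(X, y, gesture_id, k=12):
    corr = np.abs(sensor_correlations(X, y, gesture_id))
    np.fill_diagonal(corr, 0.0)        # ignore each sensor's self-correlation
    redundancy = corr.mean(axis=1)     # mean |r| with all other sensors
    return np.argsort(redundancy)[:k]  # indices of the k most independent sensors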

Figure 5. The evaluation of the models based on the number of sensors.

D. Sensor Selection
After filtering similar sensors, we select the most important ones by evaluating individual sensor importance in a machine learning model. To this end, we randomly permute the values of a single sensor in the dataset and measure the resulting change in the model's accuracy, precision, and recall. If a sensor is important for the model's performance, permuting its values will result in a significant decrease in performance. Based on these sensor-importance rankings, we repeated the machine learning experiments with the same models and plotted their accuracy and performance.

We then conducted tests using reduced sets of sensors; the results are presented in Figure 5. It is observed that increasing the number of sensors improves the overall accuracy of the models. Interestingly, the Multiclass Boosted Decision Tree classifier achieved the highest prediction accuracy with only 5 sensors. In general, with 12 sensors, most models exhibited 99% classification accuracy.
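The permutation procedure described above can be transcribed almost directly, as sketched below; the number of repeats and the use of accuracy as the score are assumptions. scikit-learn's sklearn.inspection.permutation_importance implements the same idea.

# Permutation importance: shuffle one sensor's column, measure the score drop.
import numpy as np

def permutation_importance_manual(model, X_test, y_test, n_repeats=10, seed=0):
    rng = np.random.default_rng(seed)
    base = model.score(X_test, y_test)         # accuracy with intact data
    drops = np.zeros(X_test.shape[1])
    for j in range(X_test.shape[1]):           # one column per pressure sensor
        for _ in range(n_repeats):
            X_perm = X_test.copy()
            rng.shuffle(X_perm[:, j])          # break sensor j's link to the labels
            drops[j] += (base - model.score(X_perm, y_test)) / n_repeats
    return drops                                # larger drop => more important sensor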
V. CONCLUSION
Our study has shown the significant potential of machine learning algorithms for finding optimum sensor positions on a robotic hand for gesture recognition tasks. By analyzing pressure sensor data, we identified key sensor positions that are highly correlated with specific hand gestures and were able to reduce the number of sensors by half while still predicting the hand gestures with impressive accuracy.

It is highly recommended that further research explore the development of adaptive sensor placement strategies that can dynamically adjust sensor positions based on the task at hand. Such advancements could bring us even closer to creating robotic systems that replicate the dexterity and versatility of the human hand, opening up new possibilities for human-robot collaboration.

ACKNOWLEDGMENT
The authors would like to express their sincere gratitude to the Faculty of Engineering at the University of Windsor for financial support through the Innovating Sustainability grant.

REFERENCES
[1] E. Prati, M. Peruzzini, M. Pellicciari, and R. Raffaeli, "How to include user experience in the design of human-robot interaction," Robotics and Computer-Integrated Manufacturing, vol. 68, p. 102072, 2021.
[2] S. M, K. Venusamy, S. S, S. S, and N. K. O, "A Comprehensive Review of Haptic Gloves: Advances, Challenges, and Future Directions," 2023 Second International Conference on Electronics and Renewable Systems (ICEARS), Tuticorin, India, 2023, pp. 227-233, doi: 10.1109/ICEARS56392.2023.10085607.
[3] A. Saudabayev and H. A. Varol, "Sensors for Robotic Hands: A Survey of State of the Art," IEEE Access, vol. 3, pp. 1765-1782, 2015, doi: 10.1109/ACCESS.2015.2482543.
[4] X. Jiang, L.-K. Merhi, and C. Menon, "Force Exertion Affects Grasp Classification Using Force Myography," IEEE Transactions on Human-Machine Systems, vol. 48, no. 2, pp. 219-226, Apr. 2018, doi: 10.1109/THMS.2017.2693245.
[5] Z. Xia et al., "Vision-Based Hand Gesture Recognition for Human-Robot Collaboration: A Survey," 2019 5th International Conference on Control, Automation and Robotics (ICCAR), Beijing, China, 2019, pp. 198-205, doi: 10.1109/ICCAR.2019.8813509.
[6] D. Sarma and M. K. Bhuyan, "Methods, databases and recent advancement of vision-based hand gesture recognition for HCI systems: A Review," SN Computer Science, vol. 2, no. 6, 2021.
[7] H. P. Gupta, H. S. Chudgar, S. Mukherjee, T. Dutta, and K. Sharma, "A Continuous Hand Gestures Recognition Technique for Human-Machine Interaction Using Accelerometer and Gyroscope Sensors," IEEE Sensors Journal, vol. 16, no. 16, pp. 6425-6432, Aug. 2016, doi: 10.1109/JSEN.2016.2581023.
[8] C. Xu, P. H. Pathak, and P. Mohapatra, "Finger-writing with Smartwatch," Proceedings of the 16th International Workshop on Mobile Computing Systems and Applications, 2015.
[9] L. Guo, Z. Lu, and L. Yao, "Human-Machine Interaction Sensing Technology Based on Hand Gesture Recognition: A Review," IEEE Transactions on Human-Machine Systems, vol. 51, no. 4, pp. 300-309, Aug. 2021, doi: 10.1109/THMS.2021.3086003.
[10] D. W. O. Antillon, C. R. Walker, S. Rosset, and I. A. Anderson, "Glove-Based Hand Gesture Recognition for Diver Communication," IEEE Transactions on Neural Networks and Learning Systems, doi: 10.1109/TNNLS.2022.3161682.
[11] J. Zhang, B. Wang, C. Zhang, Y. Xiao, and M. Y. Wang, "An EEG/EMG/EOG-based Multimodal Human-machine Interface to Real-time Control of a Soft Robot Hand," Frontiers in Neurorobotics, vol. 13, 2019.
[12] L. Yan, Y. Mi, Y. Lu, Q. Qin, X. Wang, J. Meng, F. Liu, N. Wang, and X. Cao, "Weaved piezoresistive triboelectric nanogenerator for human motion monitoring and gesture recognition," Nano Energy, vol. 96, p. 107135, 2022.
[13] A. Saudabayev and H. A. Varol, "Sensors for Robotic Hands: A Survey of State of the Art," IEEE Access, vol. 3, pp. 1765-1782, 2015, doi: 10.1109/ACCESS.2015.2482543.
[14] J. Xu, J. Pan, T. Cui, S. Zhang, Y. Yang, and T.-L. Ren, "Recent Progress of Tactile and Force Sensors for Human-Machine Interaction," Sensors, vol. 23, no. 4, p. 1868, Feb. 2023, doi: 10.3390/s23041868.
[15] R. J. Schwarz and C. Taylor, "The anatomy and mechanics of the human hand," Artificial Limbs, vol. 2, no. 2, pp. 22-35, 1955.
[16] N. Jarque-Bou, A. Manfredo, and H. Müller, "Calibrated kinematic Ninapro hand movements data," Zenodo, 2019, doi: 10.5281/zenodo.3480074.
[17] K. P. Murphy, Probabilistic Machine Learning: An Introduction. MIT Press, 2022.