
Preprint version of the manuscript published in the proceedings of the 10th RSI International Conference on Robotics and Mechatronics (ICRoM 2022), Nov. 15-18, 2022, Tehran, Iran. DOI: 10.1109/ICRoM57054.2022.10025351.

A Wearable RFID-Based Navigation System for the Visually Impaired

Fateme Zare
Faculty of Electrical Engineering
K. N. Toosi University of Technology
Tehran, Iran
[email protected]

Paniz Sedighi
Faculty of Electrical and Computer Engineering
University of Alberta
Alberta, Canada
[email protected]

Mehdi Delrobaei
Faculty of Electrical Engineering
K. N. Toosi University of Technology
Tehran, Iran
[email protected]

Abstract—Recent studies have focused on developing advanced assistive devices to help blind or visually impaired people. Navigation is challenging for this community; however, developing a simple yet reliable navigation system is still an unmet need. This study targets the navigation problem and proposes a wearable assistive system. We developed a smart glove and shoe set based on radio-frequency identification technology to assist visually impaired people with navigation and orientation in indoor environments. The system enables the user to find directions through audio feedback. To evaluate the device's performance, we designed a simple experimental setup. The proposed system has a simple structure and can be personalized according to the user's requirements. The results indicate that the platform is reliable, power-efficient, and accurate enough for indoor navigation.

Index Terms—Radio Frequency Identification, assistive technology, wearable devices, human-machine interaction

I. INTRODUCTION

A recent global report by the WHO indicates that at least 2.2 billion people struggle with vision impairment or blindness [1]. Visual impairment could have been prevented in at least 50% of these cases. Moreover, the prevalence of vision impairment in low- and middle-income regions is estimated to be four times higher than in high-income regions.

The visually impaired are often challenged when interacting with their surroundings, for example when navigating unfamiliar environments. As a result, they have lower workforce participation rates and may experience more inconvenience.

The white cane is commonly used by the blind and visually impaired. White canes are affordable and help users detect obstacles up to knee level. However, they are not efficient in providing enough information about the environment.

Assistive devices can enhance the quality of life of the visually impaired. Such devices may enable them to navigate independently, detect obstacles, and identify objects. Assistive technologies are also effective for rehabilitation, which improves the functioning of the visually impaired. Inventions such as electronic travel aids (ETAs) help them improve their mobility. Although various assistive navigation devices are available, they are less popular among the visually impaired [2]. This fact indicates that further research should be conducted to enhance the capability and usability of assistive devices.

For this purpose, a practical design must meet the following specifications: (1) simple and low-cost construction, (2) lightweight, (3) reliable user interface, (4) power-efficient, and (5) acceptable accuracy.

In this work, we propose a system that includes a glove and a shoe based on radio-frequency identification (RFID) to meet all these requirements. The shoe and the glove provide non-contact data transfer between their transponders and the RFID passive tags, either on the floor or on the surrounding objects. The user is then guided by a set of audio feedback messages generated based on a navigation algorithm and audio files saved in an onboard database.

This paper is organized as follows. Section II surveys related work, discussing the advantages and limitations of existing solutions. Section III describes the design and development of our proposed system. Section IV demonstrates the designed experiment and reports experimental results as well as the limitations of the system. Section V concludes the paper and proposes future perspectives.

Bai et al. [4] demonstrated a system including an RGB-D camera, a smartphone, IMU sensors, and earphones to receive the audio feedback. The device was designed for object recognition and localization in indoor and outdoor environments, but only indoor applications were achieved. Aladrén et al. [5] designed a simple RGB-D device for navigation and obstacle detection in indoor environments. They used sound map information and voice comments to guide the user.

Several researchers [6], [7] developed devices based on RGB-D cameras for object or face recognition. Although using machine vision algorithms, RGB-D cameras, and stereoscopic or binocular sensors provides a more accurate description of the environment, these approaches require complex computations and are expensive [8], [9].

B. Ultrasound and Infrared-based methods

Infrared (IR) based techniques are among the most common positioning systems due to the availability of IR technology in various gadgets. An IR-based positioning system needs a line-of-sight link between the transmitter and receiver without any interference. The benefits of this technology are its small size and light weight. The major limitations are the short range and maintenance cost [10].

NavGuide is an assistive device to aid visually impaired people in detecting obstacles. The device consists of a low-power embedded system with ultrasonic sensors, vibration motors, and a power supply. NavGuide provides audio feedback to the user through headphones [11].

Systems based on ultrasound technology are relatively low-cost, but their precision is lower than that of IR-based systems owing to reflection effects. Additionally, this kind of system is usually combined with other technologies, which may increase the overall cost [12]. Furthermore, temperature, humidity, and high-frequency sounds can affect the measurements.

C. GPS-based methods

GPS-based techniques are commonly used in outdoor navigation [13], since the GPS signal is weakened inside buildings and the accuracy is usually no better than 15 meters [14]. Velázquez et al. [15] designed a system consisting of a smartphone, a microcontroller, and an RF module. The data is transmitted to a cloud server via an internet connection, and the user receives foot tactile feedback to find the direction. Another work by Ramadhan [16] proposed a wearable device including ultrasonic sensors, a microcontroller, accelerometers, a GSM module, and a GPS receiver.

D. RFID-based methods

RFID sensing methods are becoming increasingly popular among researchers in different fields, and RFID sensor technology is increasingly utilized in biomedical areas [17]. Devipriya et al. [18] proposed an RFID-based smart store assistor for the visually impaired. This system consisted of three modules: a product identifier, a smart glove, and a smart trolley. In another work, Meshram et al. [19] provided NaveCane, an assistive device for autonomous orientation, to help the visually impaired. Andò et al. [20] presented a measurement strategy to assess an RFID-based navigation aid. The proposed system consisted of UHF RFID tags, an RFID reader, an antenna, and a microcontroller. They provided an index to evaluate the RFID transponder's sensitivity and measure its performance. Alghamdi et al. [21] conducted two case studies to evaluate a combined technique of power attenuation and a received signal strength indicator using RFID. This technique was designed for both outdoor and indoor environments. However, their system had to be used in combination with a white cane.

Poor user interfaces, functional complexity, inappropriate weight and size, and high price are reasons for the low popularity of the available assistive devices. Also, the lack of a system that supports 3D user interaction with the environment motivated us to address the limitations of existing navigation systems.

We developed a system consisting of an assistive glove and a shoe. The user navigates with the help of wireless communication between passive RFID tags and the transponders under the shoe and on the glove. The system has a user interface including a keypad (input) and a set of headphones (output). The shoe mainly provides audio feedback to assist the visually impaired in navigating.

III. DESIGN AND DEVELOPMENT

A. Design Concepts

This section demonstrates the hardware components and the design and development procedure of the assistive navigation device. Our suggested platform is lightweight, easy to use, and low-cost. To develop a 3D interaction with the environment, we introduce a system consisting of two devices: an assistive glove for object recognition and a shoe for autonomous navigation.

After defining the design concept, we developed the assistive glove for object recognition. The glove's primary function is to identify passive RFID-tagged objects, relate audio recording messages to their unique IDs, and help the user navigate. After testing the glove in different experiments and scenarios [22], we decided to design a system for helping the visually impaired in indoor navigation. Similarly, for the shoe, we defined a database of a room on the Raspberry Pi. We then covered the room with RFID tags and embedded an RFID module under the shoe. Audio instructions are generated as the user walks over the RFID passive tags to aid the user in navigating.

The shoe and the glove use independent onboard databases and can be utilized either separately or in an integrated mode. Furthermore, the shoe and the glove can communicate wirelessly through the Raspberry Pi's Wi-Fi or Bluetooth with a PC or a smartphone. This configuration allows the user to create onboard databases where the required information about the target is saved.

According to Fig. 1, the system comprises two central parts: (1) the RFID tags located in specific locations or on specific objects, and (2) the wearable device worn by the user.
Fig. 1. The architecture of the system. Devices consist of RFID transponders, the processing units, the voice recorder (and player), the user interface, and a power supply (up), and the RFID passive tags labeled in specific places or on objects (down).

Fig. 3. The proposed prototype of the shoe.

B. Prototype Development

The proposed platforms consist of five main components: the RFID transponder, the microcontroller, the voice player, the user interface, and the power supply (Figs. 2 and 3). This section provides the details of the navigation system's prototype.

1) User interaction: Based on the number of available tags in the room, the keypad takes multiple digits as input, where each number represents a unique tag on the floor. By entering a number and then pressing the hash key, the destination is set to the corresponding tag. The star key can be used to restart the program. Aside from giving the user voice commands towards the destination, we designated a separate button to play audio recordings that describe the user's position: if users require additional information about their surroundings or nearby objects, they press the "A" key. This kind of audio information exists only for certain tags that represent landmarks.
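To make this interaction flow concrete, here is a small, hypothetical sketch of the keypad logic just described: digits accumulate into a tag number, "#" confirms the destination, "*" restarts, and "A" requests the landmark description. The function and callback names are illustrative and are not taken from the authors' code.

```python
# Hypothetical sketch of the keypad handling described in Sec. III-B.1.
# read_key(), announce(), start_navigation(), and describe_current_landmark()
# stand in for the real I/O routines of the device.

def handle_keypad(read_key, announce, start_navigation, describe_current_landmark):
    """Collect digits until '#' sets the destination tag; '*' restarts; 'A' describes landmarks."""
    buffer = ""
    while True:
        key = read_key()                       # one keypad character at a time
        if key.isdigit():
            buffer += key                      # build up the destination tag number
        elif key == "#" and buffer:
            destination = int(buffer)
            announce(f"Destination set to tag {destination}.")
            start_navigation(destination)
            buffer = ""
        elif key == "*":
            buffer = ""                        # restart the program state
            announce("Restarted. Enter a destination.")
        elif key == "A":
            describe_current_landmark()        # audio exists only for landmark tags
```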
2) The path-finding algorithm: A graph can be formed by considering the arranged tags on the floor as nodes and the distances between them as edges. This graph represents a basic map of the environment. The nodes can also serve as landmarks, representing a piece of equipment or a point of interest. The goals for the user could be to (1) find the shortest path to their desired destination, (2) avoid fixed obstacles, and (3) obtain general information about their position and surroundings. To achieve these goals, an onboard database and a shortest-path algorithm are required.

Fig. 2. The final prototype of the glove.

3) The shortest path algorithm: In our design, each node is connected to a maximum of six neighboring nodes in a hexagonal pattern. All tags are equidistant; hence the weights of the edges are considered the same (equal to 1). This holds unless no edge can be identified due to the presence of a barrier or an obstacle, or it is comparatively hard to cross that edge. In such special cases, the weight is considered 2.
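As an illustration of this weighting scheme and of the Dijkstra-based search summarized in Fig. 4, the following sketch builds a small weighted tag graph and finds the lowest-cost route through it. The graph contents are made up for the example; the actual map and node names live in the onboard database.

```python
# Illustrative shortest-path search over the tag graph (not the authors' code).
# Edges normally weigh 1; edges that are hard to cross (e.g., near an obstacle) weigh 2.
import heapq

# Hypothetical adjacency list: node -> {neighbor: weight}
GRAPH = {
    "A": {"B": 1, "C": 1},
    "B": {"A": 1, "C": 1, "D": 2},   # B-D passes close to an obstacle, so weight 2
    "C": {"A": 1, "B": 1, "D": 1},
    "D": {"B": 2, "C": 1},
}

def dijkstra(graph, start, goal):
    """Return the lowest-cost list of tags from start to goal, or None if unreachable."""
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph[node].items():
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return None

if __name__ == "__main__":
    print(dijkstra(GRAPH, "A", "D"))  # -> ['A', 'C', 'D'] (cost 2, avoiding the weight-2 edge)
```

In the system described in the paper, this search is simply re-run from the most recently scanned tag, which is why the user's previous steps never need to be checked.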

The complete graph includes all the vertices, their given names, their neighboring vertices, and the weights of the edges. An optimization algorithm is needed to find the shortest-path route from any starting point to the desired target among the vertices in this graph. Fig. 4 explains the employed algorithm (based on Dijkstra's algorithm).

It should be noted that we cannot give proper instructions if we do not know the user's orientation. The algorithm therefore considers two consecutive tags to identify the orientation. In addition, there is no need to check whether the user has followed the instructions correctly, since Dijkstra's algorithm calculates a new path regardless of the user's previous step.

Fig. 4. The algorithm starts by entering the target tag number. After scanning two consecutive tags upon starting the program, the algorithm determines the user's orientation. Then, the tags are scanned one by one as the user is given audio instructions. An interrupt button can restart the program, and another describes landmarks to the user.

4) Database: Storing and accessing such a comprehensive graph requires a database available at all times without any wireless connection. The database includes all the audio instructions and the characteristics of the room's plan, such as the arrangement of the tags, their positions with respect to each other, and the distances between them. The database was implemented using the SQLite library in Python. The SQLite library offers practical features such as expanding the database.

5) The RFID Transponder: The RFID module is the critical component, and we required an easy-to-use, affordable module. In this study, we used passive RFID tags compliant with the ISO 14443A standard at 13.56 MHz and a 13.56 MHz MF-RC522 module. This transponder supports the serial peripheral interface (SPI), I2C, and UART protocols and provides reliable two-way data transfer at 424 kbit/s. The transponder measures 60 × 30 mm.

6) The Processor: To design the shoe, we implemented the system on an Arduino Nano 3, a Raspberry Pi 3B+, and a Raspberry Pi 4B. The Arduino platform is powered by an ATmega328 processor, which runs at 16 MHz with 32 KB of flash memory. We first implemented the system on the Arduino Nano 3 to identify the defects. We needed higher processing speed to handle up to 1000 tags and adequate storage for our audio files and databases. Compared with the Arduino Nano, the Raspberry Pi 3B+ and 4B have faster 1.4 GHz and 1.5 GHz 64-bit quad-core processors. They also rely on removable micro SD cards, so they are suitable choices for our system. Since the Raspberry Pi 4B and 3B+ had approximately the same power consumption, we selected the Raspberry Pi 4B.

7) The Audio Unit: The user interacts with the system through audio feedback. The user is guided by audio messages played while wearing the shoe and scanning RFID tags placed on the floor. We first selected the WTV020-SD-16P sound module for the audio unit. After analyzing the system's performance, we added a USB sound card adapter and a microphone to the Raspberry Pi. This configuration enabled the proposed glove to record or play the audio messages. For the smart shoe, the audio instructions are played through a hands-free headset connected to the 3.5 mm audio jack.

8) The Power Supply: This study uses a power bank with a capacity of 10000 mAh to power the system.

IV. EVALUATION AND EXPERIMENTAL RESULTS

A simple task was designed to evaluate the shoe's usability and performance. The details are as follows.

A. Experimental Setup

A 4.1 × 3.2 m room was used to build the experimental setup. This room was set up to represent a physician's office with two distinct areas: a reception area and a waiting room (including a coffee table, a water dispenser, and a vending machine). Twenty-four passive RFID tags with unique IDs covered the whole area. Fig. 6 illustrates the experimental setup.

The tag placement was intended to help the users avoid obstacles; therefore, the tags were placed in the vicinity of the obstacles, such as the coffee table. The locations of the obstacles were initially registered in the onboard database (hence, no extra sensors were required). The system could provide two types of audio feedback: the first was the navigation instructions, and the second was the landmark announcements. Table I and Table II show the audio instructions. The audio cues were chosen to be brief and easy to follow. Fig. 5 shows the evaluation of the device in the real environment.
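To illustrate how the onboard map and audio data described in Sec. III-B.4 (and registered for this room during setup) might be laid out, the following sketch creates a small SQLite database with the kind of tables the text implies: tags, weighted adjacencies, and landmark audio files. The table and column names are hypothetical; the paper does not publish its schema.

```python
# Hypothetical onboard-database sketch (SQLite from the Python standard library).
import sqlite3

def create_map_db(path="room_map.db"):
    con = sqlite3.connect(path)
    cur = con.cursor()
    cur.executescript("""
        CREATE TABLE IF NOT EXISTS tags (
            uid        INTEGER PRIMARY KEY,   -- RFID tag UID
            name       TEXT NOT NULL,         -- node label, e.g. 'A', 'Q'
            audio_file TEXT                   -- landmark recording, if any
        );
        CREATE TABLE IF NOT EXISTS edges (
            src    TEXT NOT NULL,
            dst    TEXT NOT NULL,
            weight INTEGER NOT NULL DEFAULT 1 -- 2 when the edge is hard to cross
        );
    """)
    # Example rows: two landmark tags and one ordinary edge between them.
    cur.execute("INSERT OR REPLACE INTO tags VALUES (?, ?, ?)",
                (812734509123, "N", "audio/waiting_area.wav"))
    cur.execute("INSERT OR REPLACE INTO tags VALUES (?, ?, ?)",
                (812734509456, "Q", "audio/reception_table.wav"))
    cur.execute("INSERT INTO edges VALUES (?, ?, ?)", ("N", "Q", 1))
    con.commit()
    return con

if __name__ == "__main__":
    con = create_map_db()
    print(con.execute("SELECT name, audio_file FROM tags").fetchall())
```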
B. Experimental Results

According to the experimental results, the RFID module's configuration offers a detection range of nearly 5 cm with an effective angle of 60°. The maximum gain of the antenna is 5.5 dBi at a perpendicular angle. With multiple tags in the detection range, the transponder scans the most direct one. Also, nearly 30 tags can fit on a 10 m² surface with our proposed configuration. The tags are the same size as a credit card and can be covered by plaster or a rug; the coverage does not significantly affect the reader's detection range. After the user scans a tag, it takes roughly up to 1.5 seconds to receive the audio instructions. For instance, the average time traveling from point "A" to point "Q" was 82 seconds.
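Combining the 82 s travel time with the average traveling velocity of 5.46 cm/s reported in Table III, the walked A-to-Q route works out to roughly 5.46 cm/s × 82 s ≈ 448 cm, i.e., about 4.5 m; this length is inferred from the two reported figures rather than stated in the paper, and assumes the velocity is averaged over the same run.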
The Raspberry Pi 4B has a current draw of 575 mA (2.85 W) at idle and 600 mA (3 W) while using one core. Meanwhile, the current consumed by the RFID transponder is over 32 mA while reading passive tags. During the active state, when the device is scanning tags and playing audio recordings, the total current draw adds up to over 650 mA (3.25 W), while at idle it stays at a minimum of 575 mA. Given the 60% average idle time, the overall daily energy consumption is about 72.3 Wh. The device could be functional for 8 hours with a 10000 mAh power supply. Although energy efficiency was not our primary focus, the system seems power-efficient for the intended applications. Table III summarizes the experimental results.
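The daily-energy figure follows directly from the quoted idle and active power draws and the 60% idle duty cycle; the short calculation below reproduces it. The battery line assumes a 5 V rating for the 10000 mAh power bank, which is our assumption, and gives only an ideal upper bound: usable runtime is lower once cell voltage and conversion losses are accounted for, consistent with the roughly 8 hours reported above.

```python
# Reproducing the ~72.3 Wh/day estimate from the quoted figures.
P_IDLE = 2.85        # W, Raspberry Pi 4B at idle (575 mA at 5 V)
P_ACTIVE = 3.25      # W, scanning tags and playing audio (650 mA at 5 V)
IDLE_FRACTION = 0.6  # 60% of the time idle on average

avg_power = IDLE_FRACTION * P_IDLE + (1 - IDLE_FRACTION) * P_ACTIVE  # ~3.01 W
daily_energy_wh = avg_power * 24                                     # ~72.3 Wh

# Assumed 5 V rating for the 10000 mAh power bank (ideal upper bound only).
bank_energy_wh = 10.0 * 5.0  # 10 Ah * 5 V = 50 Wh
print(f"{avg_power:.2f} W average, {daily_energy_wh:.1f} Wh/day, "
      f"<= {bank_energy_wh / avg_power:.1f} h per charge (ideal)")
```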
Fig. 5. The evaluation of the device.

TABLE I
AUDIO FEEDBACK FOR LANDMARKS

Audio Feedback                        Node
Office number 1 is located here.      A
This is the women's bathroom.         C
This is the men's bathroom.           D
This is the doctor's office.          I
This is office number 2.              J
There is a coffee table around.       M
This is the waiting area.             N
This is the reception table.          Q
There is a round table here.          R
This is the vending machine.          V
The receptionist is here.             X

TABLE II
NAVIGATION AUDIO INSTRUCTIONS

Audio Instruction                                       Direction
Walk straight ahead.                                    North
Turn to your 2 o'clock and keep walking slowly.         North-East
Turn to your 10 o'clock and keep walking slowly.        North-West
Make a U-turn.                                          South
Turn to your 4 o'clock and keep walking slowly.         South-East
Turn to your 8 o'clock and keep walking slowly.         South-West

C. Limitations

The relatively large distance between the tags was a limitation in this work. However, increasing the number of tags leads to more nodes, higher computational complexity, longer feedback delays, and slower walking speed. Such factors may cause difficulties for the user. Also, this work did not test a 3D tag placement or the involvement of the glove in navigation (3D interaction with the environment).

V. CONCLUSION AND FUTURE WORK

We presented the design details of a wearable device to assist people with visual impairment with indoor navigation in structured environments. The device showed efficiency in the designed experiment. The characteristics of the proposed system are as follows: 1) the system is self-supporting and does not depend on other devices such as smartphones or the Internet; 2) due to its simple structure, the platform is relatively affordable for the user and thus can be made available to the visually impaired community; 3) the device can be personalized according to the user's requirements.

The system's reliability and accuracy will be evaluated as part of our future work in different scenarios with multiple visually impaired users. We also plan to improve the user interface by adding a Braille keypad to the system. A smaller Raspberry Pi version, the Raspberry Pi Zero, and flexible boards could also be employed. We also noted that the system must be water-resistant in some applications.

TABLE III
SUMMARY OF EXPERIMENTAL RESULTS

Name                                              Result
RFID maximum detection range and gain             5 cm, 5.5 dBi
Average delay time for receiving instructions     1.5 seconds
Average traveling velocity from "A" to "Q"        5.46 cm/s
Overall daily energy consumption                  72.3 Wh

Fig. 6. The test room. The markers (landmarks) are shown in this room with red tags. A dashed line indicates that the corresponding path can be traversed. All neighboring tags are equally spaced.

REFERENCES

[1] World Health Organization, "Blindness and Vision Impairment," Who.int, Oct. 11, 2021. https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment
[2] S. Bhatlawande, M. Mahadevappa, J. Mukherjee, M. Biswas, D. Das, and S. Gupta, "Design, development, and clinical evaluation of the electronic mobility cane for vision rehabilitation," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 22, no. 6, pp. 1148–1159, 2014.
[3] F. Barontini, M. G. Catalano, L. Pallottino, B. Leporini, and M. Bianchi, "Integrating Wearable Haptics and Obstacle Avoidance for the Visually Impaired in Indoor Navigation: A User-Centered Approach," IEEE Transactions on Haptics, vol. PP, no. 99, p. 1, 2020.
[4] J. Bai, Z. Liu, Y. Lin, Y. Li, S. Lian, and D. Liu, "Wearable Travel Aid for Environment Perception and Navigation of Visually Impaired People," Electronics, vol. 8, no. 6, p. 697, 2019.
[5] A. Aladrén, G. López-Nicolás, L. Puig, and J. J. Guerrero, "Navigation Assistance for the Visually Impaired Using RGB-D Sensor With Range Expansion," IEEE Systems Journal, vol. 10, no. 3, pp. 922–932, 2016.
[6] L. B. Neto, F. Grijalva, V. R. M. L. Maike, L. C. Martini, D. Florencio, M. C. C. Baranauskas, A. Rocha, and S. Goldenstein, "A Kinect-Based Wearable Face Recognition System to Aid Visually Impaired Users," IEEE Transactions on Human-Machine Systems, vol. 47, no. 1, pp. 52–64, 2017.
[7] R. Jafri, "A GPU-accelerated real-time contextual awareness application for the visually impaired on Google's project Tango device," The Journal of Supercomputing, vol. 73, no. 2, pp. 887–899, 2017.
[8] B. Jiang, J. Yang, Z. Lv, and H. Song, "Wearable Vision Assistance System Based on Binocular Sensors for Visually Impaired Users," IEEE Internet of Things Journal, vol. 6, no. 2, pp. 1375–1383, 2019.
[9] A. Mancini, E. Frontoni, and P. Zingaretti, "Mechatronic System to Help Visually Impaired Users During Walking and Running," IEEE Transactions on Intelligent Transportation Systems, vol. 19, no. 2, pp. 649–660, 2018.
[10] N. Saeed, H. Nam, T. Y. Al-Naffouri, and M. Alouini, "A State-of-the-Art Survey on Multidimensional Scaling-Based Localization Techniques," IEEE Communications Surveys & Tutorials, vol. 21, no. 4, pp. 3565–3583, 2019.
[11] K. Patil, Q. Jawadwala, and F. C. Shu, "Design and Construction of Electronic Aid for Visually Impaired People," IEEE Transactions on Human-Machine Systems, vol. 48, no. 2, pp. 172–182, Apr. 2018.
[12] Z. Da, F. Xia, Z. Yang, L. Yao, and W. Zhao, "Localization technologies for indoor human tracking," in IEEE 5th International Conference on Future Information Technology, 2010.
[13] R. Tapu, B. Mocanu, and T. Zaharia, "Wearable assistive devices for visually impaired: a state-of-the-art survey," Pattern Recognition Letters, vol. 137, pp. 37–52, 2018.
[14] R. Dhod, G. Singh, G. Singh, and M. Kaur, "Low Cost GPS and GSM Based Navigational Aid for Visually Impaired People," Wireless Personal Communications, vol. 92, no. 4, pp. 1575–1589, 2017.
[15] R. Velázquez, E. Pissaloux, P. Rodrigo, M. Carrasco, N. I. Giannoccaro, and A. Lay-Ekuakille, "An Outdoor Navigation System for Blind Pedestrians Using GPS and Tactile-Foot Feedback," Applied Sciences, vol. 8, no. 4, p. 578, 2018.
[16] A. J. Ramadhan, "Wearable Smart System for Visually Impaired People," Sensors, vol. 18, no. 3, p. 843, 2018.
[17] L. Cui, Z. Zhang, N. Gao, Z. Meng, and Z. Li, "Radio Frequency Identification and Sensing Techniques and Their Applications—A Review of the State-of-the-Art," Sensors, vol. 19, no. 18, p. 4012, Sep. 2019.
[18] D. Devipriya, V. S. Sri, and I. Mamatha, "Store Assistor for Visually Impaired," 2018 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Sep. 2018.
[19] V. V. Meshram, K. Patil, V. A. Meshram, and F. C. Shu, "An Astute Assistive Device for Mobility and Object Recognition for Visually Impaired People," IEEE Transactions on Human-Machine Systems, vol. 49, no. 5, pp. 449–460, Oct. 2019.
[20] B. Andò, S. Baglio, V. Marletta, R. Crispino, and A. Pistorio, "A Measurement Strategy to Assess the Optimal Design of an RFID-Based Navigation Aid," IEEE Transactions on Instrumentation and Measurement, vol. 68, no. 7, pp. 2356–2362, Jul. 2019.
[21] S. Alghamdi, R. van Schyndel, and I. Khalil, "Accurate positioning using long range active RFID technology to assist visually impaired people," Journal of Network and Computer Applications, vol. 41, pp. 135–147, 2014.
[22] P. Sedighi, M. H. Norouzi, and M. Delrobaei, "An RFID-Based Assistive Glove to Help the Visually Impaired," IEEE Transactions on Instrumentation and Measurement, vol. 70, pp. 1–9, March 2021.
