
Journal of Automation, Mobile Robotics and Intelligent Systems, Volume 16, N° 4, 2022

Concept of Using the Brain-Computer Interface to Control Hand Prosthesis

Submitted: 8th November 2022; accepted: 7th February 2023

Julia Żaba and Szczepan Paszkiel

DOI: 10.14313/JAMRIS/4-2022/27

Abstract:
This study examines the possibility of implementing intelligent artificial limbs for patients after injuries or amputations. Brain-computer technology allows signals to be acquired and sent between the brain and an external device. Upper limb prostheses, however, are quite a complicated tool, because the hand itself has a very complex structure and consists of several joints. The most complicated joint is undoubtedly the saddle joint, which is located at the base of the thumb. Adequate anatomical knowledge is needed to construct a prosthesis that is easy to use and resembles a human hand as closely as possible. It is also important to create the right control system, with software that works smoothly with the brain-computer interface. The solution proposed in this work therefore consists of three parts: the Emotiv EPOC+ NeuroHeadset, a control system made of a servo and an Arduino UNO board (with dedicated software), and a hand prosthesis model designed in the three-dimensional graphics program Blender and printed on a 3D printer. Such a hand prosthesis, controlled by a signal from the brain, could help people with disabilities after amputations and people who have damaged innervation at the stump site.

Keywords: BCI, EEG, hand prosthesis, hand, prosthesis, 3D printing

1. Introduction
Brain testing uses several methods, one of which is the measurement of brain waves. These brain waves can be collected in the form of electrical signals. The acquisition of brain signals can be done invasively or non-invasively. The invasive method involves placing sensors inside the scalp, but this is a risky course of action. With the non-invasive method, the sensors are placed on the skin. However, this method is noisy, making it difficult to extract useful information. The connection between the brain and an external device is called the brain-computer interface (BCI) [1-3]. Currently, the most popular data source for BCI is EEG signals from surface brain activity, because this type of measurement is non-invasive [4, 5].
BCI can improve the quality of life of people with severe motor disabilities. BCI captures the user's brain activity and translates it into commands that control an effector such as a computer cursor, a robotic limb, or a functional electrical stimulation device [6]. BCI has many applications, for example in medicine. Rui Na et al. [7] presented the control of an electric wheelchair using BCI. In their design they used steady-state visual evoked potentials (SSVEP). The wheelchair uses a hybrid visual stimulator that combines the advantages of a liquid crystal display (LCD) and light-emitting diodes (LED). M. Vilela and L. R. Hochberg [6] described new developments to improve the user experience of BCI with effector robots. Fully efficient manipulation of robots and prosthetic arms via a BCI system is challenging due to the inherent need to decode multidimensional, and preferably real-time, control commands from the user's neural activity. Such functionality is fundamental if BCI-controlled robotic or prosthetic limbs are to be used for daily activities. BCI also has applications in rehabilitation, such as BCI-controlled robots designed for motor assistance, helping paralyzed patients improve upper and lower limb mobility [8]. Different algorithms are used to classify brain signals.
Channel selection is a key topic in BCI. Imagined hand movement is a frequently used component of the learning data set for such algorithms. For example, Milanović [9] used a sequence of 70 tasks alternating between imagining a right-hand movement and resting the hand. S. Soman and B. K. Murthy [10] created a BCI system for generating synthesized speech that operates on eye blinks detected from the user's electroencephalogram signals. Khan et al. [11] developed a broad overview of the applications of BCI interfaces in the context of the upper extremity. Gubert et al. [12] analyzed left- and right-hand motor imagery. They used publicly available databases and the CSP (Common Spatial Patterns) algorithm. Hernández-Del-Toro et al. [13] used the Emotiv EPOC interface. As a test sequence, they used a set of imagined words spoken in Spanish (up, down, left, right, choice), each repeated randomly 100 times by 27 individuals. Fourteen EEG channels were used; the sampling rate was 128 Hz. The discrete wavelet transform (DWT) and fractal methods, among others, were used to analyze the signals. The nearest neighbor method (decision tree

© 2022 Żaba and Paszkiel. This is an open access article licensed under the Creative Commons Attribution 4.0 International (CC BY 4.0)
(https://creativecommons.org/licenses/by-nc-nd/4.0)

method) and support vector machine (SVM), among other tools, were used for classification.
Task-irrelevant and redundant channels used in BCI can lead to low classification accuracy, high computational complexity, and application inconvenience. By choosing optimal channels, the performance of BCI can improve significantly. B. Shi et al. [14] proposed a novel binary harmony search (BHS) to select optimal channel sets and to optimize the accuracy of the BCI system. BHS is run on training datasets to select optimal channels, and test datasets are used to evaluate the classification performance on the selected channels. The authors proposed the BHS method for selecting optimal channels in motor imagery (MI)-based BCI. Their results validate the BHS algorithm as a channel selection method for motor imagery data. The BHS method, at a lower computational cost, gives better average test accuracy than steady-state genetic algorithms. The proposed method can improve the practicality and convenience of BCI systems.
F. M. Noori et al. [11] proposed a new technique for determining optimal feature combinations and obtaining maximum classification performance for BCI based on functional near-infrared spectroscopy (fNIRS). The results of the proposed hybrid GA-SVM technique, by selecting the optimal feature combinations for fNIRS-based BCI, provide opportunities to enhance classification performance.
Janani et al. [15] applied a deep learning neural network algorithm to classify motor imagery based on infrared signals. Functional near-infrared spectroscopy (fNIRS) was used, in which infrared light passes through the hemodynamic system. The phenomenon exploited is the change in absorption of infrared radiation depending on its wavelength. The principle of operation is similar to that of a blood oxygen saturation meter.
BCI will also find application in neuroprosthetics. Neuroprosthetics is a combination of neuroscience and biomedical engineering. Implantable devices can significantly improve quality of life due to their unique performance. The combination of biomedical engineering and neuroprosthetics has led to the development of new hybrid biomaterials that meet the needs of ideal neuroprosthetics. The site of implantation of the prosthesis determines the type of material and the method of fabrication. P. Zarrintaj et al. [16] described the types of biomaterials used for bionic neuroprostheses. The diversity of neuroprosthetics necessitates the use of a wide range of materials, from organic to inorganic. However, using only metals can cause inflammation, due to their incompatibility with soft tissues. Metal-polymer hybrids can reduce the mismatch between soft tissues and electrodes, where the polymer part can regulate the modulus of the metal. Moreover, different types of electrodes should be selected for different types of signal recording. Therefore, the selection of biomaterials for neuroprostheses is crucial and requires knowledge of the electrode implantation site and the material characteristics.

2. Examples of Implementation Concepts in the Field of the Artificial Hand
This article describes the concept of a proprietary BCI-controlled hand prosthesis. A hand prosthesis controlled by a signal from the brain can help people with disabilities who lack a hand or are after amputation, as well as people with damaged innervation at the stump site. This solution uses a non-invasive method, so people who are not entirely convinced by this approach can test whether it suits them without any interference with their body. The main goals are to select an EEG device, to design and construct a prototype of a hand prosthesis, and to select and program an appropriate control system.
A prosthesis is a tool that supports or replaces a lost limb and helps an amputee carry out daily tasks. Instead of passive devices that are purely aesthetic, current devices offer improved functionality using robotic technology. M. A. Abu Kasim et al. [17] presented a conceptual idea of using a non-invasive Emotiv headset to control a prosthetic hand using LabVIEW. The design aims at cost-effective upper limb prostheses controlled by signal artifacts and facial expressions. The device can be used and controlled by paralyzed persons with limited communication skills via a graphical user interface (GUI). It is worth noting that a non-invasive BCI method was used in the project. The GUI is created in LabVIEW software connected to the Arduino board via a serial USB data connection.
The use of body-powered prostheses can be tiring and lead to further compliance and prosthetic problems. BCI makes it possible to operate prostheses for patients who are otherwise unable to use such devices due to physical limitations. The problem with BCIs is that they usually require invasive recording methods, where surgery needs to be performed. G. Lange et al. [18] presented a study testing the ability to control the movement of an upper limb prosthetic terminal device by classifying electroencephalogram data from actual grasping and releasing movements. They developed a novel EMG-assisted approach to classifying EEG data from hand movements. This demonstrates the possibility of more intuitive control of an upper limb prosthetic terminal device with a low-cost BCI, without the risk of invasive measurement.
R. Alazrai, H. Alwanni, and M. I. Daoud [19] described a new EEG-based BCI system that they used to decode the movements of each finger of the same hand. It is based on the analysis of EEG signals using a quadratic time-frequency distribution (QTFD), the Choi-Williams distribution (CWD). In particular, the CWD is used to characterize the time-varying spectral components of the EEG signals and to extract features that can capture motion-related information. The extracted CWD-based features are used to create a two-tier classification structure that decodes the finger movements of the same hand.
J. E. Downey, J. Brooks, and S. J. Bensmaia [20] described technologies designed to sense the state of the hand and its contact with objects, and to connect with the peripheral and central nervous systems. The skillful manipulation of objects is based not only on a


sophisticated motor system that moves the arm and hand, but also on the accumulation of sensory signals that convey information about the consequences of these movements. The development of a skillful bionic hand therefore requires the restoration of both control and sensory signals. It is important that the bionic hand is well constructed and allows freedom of movement; to achieve this, the sensors need to be properly attached. Research aims to create artificial sensory feedback through electrical nerve stimulation in amputees or electrical brain stimulation in tetraplegic patients. While artificial sensory feedback, still in its early stages, is already giving bionic hands more dexterity, ongoing research into making artificial limbs more natural offers hope for further improvements.
Guger et al. [21] presented a system that uses EEG for hand prosthesis control. Digital input/output channels are used to control a remote control connected to a microcontroller that controls the prosthesis. The microcontroller receives commands from the remote control and regulates the speed of the grip. The technique of motor imagery was used to control the hand. After an appropriate beep was heard, the user had to imagine the movement of his left or right hand, depending on the arrow displayed on the monitor. This took a few seconds, after which the EEG signal was classified and used to control the prosthesis. One session required the authors to make as many as 160 attempts, and the authors performed three sessions. The operation of the system is based on BCI software and hardware. Matlab Simulink is used to calculate, in real time, various parameters that describe the current EEG state. Matlab also supports data acquisition, synchronization, and presentation of the experimental paradigm.
In their article, G. R. Müller-Putz and G. Pfurtscheller [22] presented a prototype of a two-axis electric hand prosthesis control, which uses an asynchronous four-class BCI based on steady-state visual evoked potentials (SSVEP). The authors constructed a stimulation device. For the experiment with the prosthetic device, they modified the hand prosthesis so that, in addition to the gripping function (opening and closing the fingers), it was also possible to rotate the wrist (left and right). Four red LEDs are mounted at specific locations on the armature. The authors used four healthy participants in their research. They performed four sessions of 40 attempts, and the participants had to follow instructions given to them by a beep. Users also had to focus on the appropriate flashing lights attached to the prosthesis to trigger the appropriate prosthetic action. The LED lights were not attached at random; each was attached precisely to trigger the right movement: one LED on the index finger to turn right, and one LED on the fifth finger to turn left. Two more LEDs were attached to the forearm: the first was used to open the hand, and the second to close it. The authors proved that an SSVEP-based BCI, operating in asynchronous mode, is feasible for the control of neuroprosthetic devices.
T. Beyrouthy et al. [23] presented a preliminary design of a mind-controlled, intelligent, 3D-printed prosthetic arm. The arm is controlled by brain commands received via EEG from the headset. The arm is equipped with a network of intelligent sensors and actuators. This smart network provides the arm with normal hand functionality and smooth movements. The arm has different types of sensors, including temperature sensors, pressure sensors, ultrasonic proximity sensors, accelerometers, potentiometers, strain gauges, and gyroscopes. EEG signals are recorded using the Emotiv EPOC wireless headset. The EEG signals provided by the input unit are sampled and processed by the processing unit. The arm is equipped with a special servo and an Arduino microcontroller, which ensures an appropriate interface between the mechanical and processing units. Multiple sensors allow the arm to interact with and adapt to the surrounding environment, to command the arm, and to provide feedback to the patient.
Constantine et al. [24] used a comprehensive model structure, from feature construction to classification, based on a neural network. Their end-to-end process meant that the team's initial solution was assembled from existing tools, starting from the first instantiation of the computational solution (CCI). The proposed architecture is complemented by the design and implementation of a hand prosthesis with multiple degrees of freedom (DOF). It incorporates a field-programmable gate array (FPGA) that converts electroencephalographic (EEG) signals into prosthetic movement. They also proposed a new subject selection and grouping technique based on the subject's motor intentions. The model implemented with the proposed architecture achieved a classification accuracy of 93.7% with a short classification time on the FPGA. Their implementation demonstrates the applicability of BCI techniques on FPGAs in practice.
In their article, J. W. Sensinger, W. Hill, and M. Sybring explore the many aspects that influence the ability of an upper limb prosthesis to affect a person's daily life. They argue that these influences can be categorized into four domains: aspects intrinsic to the person; factors focused on the design, control, and sensory feedback of the prosthesis; facets external to the person; and outcome measures used to evaluate devices, activities, or quality of life. The purpose of a prosthetic device is to improve a person's quality of life [25].

3. Materials and Methods
The methodology has three stages: acquisition of EEG data from the selected BCI device, design and printing of a 3D prosthetic hand, and programming of the control system. The EEG signal acquisition device is the Emotiv EPOC+ NeuroHeadset, which has the following specifications [26]:
- 14 recording electrodes and 2 reference electrodes, offering optimal positioning for accurate spatial resolution;
- the channel names, based on the international 10-20 electrode location system, are: AF3, F7,


F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4, with CMS/DRL references at locations P3/P4;
- uses a sequential sampling method with a single ADC, at 256 samples per second (SPS) per channel (2048 Hz internal);
- operates at 16-bit resolution per channel with a frequency response of 0.16-43 Hz;
- supports Bluetooth Smart 4.0 LE;
- has high resolution (14-16 bit);
- typical operating time from a full charge is 12 hours.

The control system consists of a microcontroller (Arduino UNO) and a servo (Feetech FT6335M standard servo). Arduino is an open-source electronics platform based on easy-to-use hardware and software. The Arduino UNO is a microcontroller board that has 14 digital input/output pins (6 of which can be used as PWM outputs), 6 analog inputs, a 16 MHz ceramic resonator (CSTCE16M0V53-R0), a USB connector, a power jack, an ICSP header, and a reset button. It is based on the ATmega328P, a low-power 8-bit CMOS microcontroller built on the AVR® enhanced RISC architecture. By executing instructions in a single clock cycle, the device achieves a throughput approaching one million instructions per second per megahertz, optimizing the trade-off between power consumption and processing speed. The Arduino UNO board operates at 5 V. It has 32 kB of Flash memory, 2 kB of RAM, 14 digital I/O pins (of which 6 can be used as PWM channels), 6 analog inputs, and popular communication interfaces [27].
The prosthetic hand model was designed in Blender. Blender is free and open-source 3D modeling software. It was created by NeoGeo and has been developed by the Blender Foundation since 2002. From the beginning, Blender's main programmer has been Ton Roosendaal. It is available for various hardware and software platforms, including Microsoft Windows, macOS, and many others. The program caters to all the needs of 3D graphic designers: it can model, animate, simulate, render, composite and track motion, edit video, and create 2D and 3D animation [28].

4. Results
It is reasonable to assume that, as a result of the loss of the hand, the brachial plexus is not functioning or may be damaged. This is a bundle of nerve fibers running from the spine all the way to the hand. It is important for the patient with the artificial hand to be able to control the prosthetic hand independently with the help of EEG signals. The task of such a prosthesis will therefore be to execute commands in correlation with the Emotiv EPOC+ NeuroHeadset device. This solution will allow the patient to fully control his hand even if the nerves in the amputated limb are not fully functional.

4.1. EEG Signal Acquisition Device
On the global market there are many companies producing brain-computer interface devices; however, two play a key role: Emotiv Systems and NeuroSky. The device that we chose to acquire EEG signals is the Emotiv EPOC+ NeuroHeadset. It allows communication with a computer based on brain activity, facial muscle tension, and emotions. It has 14 recording electrodes and 2 reference electrodes, which is sufficient in this case. It connects wirelessly to computers and mobile devices and has 9-axis motion sensors. It stands out for its long working time (up to 12 hours), and the device sets up quickly. It is also important to remember to properly moisten the reference sensors with saline solution so that signal reception occurs properly.
In the box of the EPOC+ Headset (Fig. 1) are:
- the Brain-Computer Interface with a built-in lithium battery,
- a universal USB receiver,
- a humidifier packet,
- saline solution,
- a USB charger with a Mini-B connector,
- a quick start guide.

Fig. 1. Basic components of the Emotiv EPOC+ NeuroHeadset

4.2. Expressiv Suite Functions
The Expressiv Suite app in the Emotiv Control Panel features an avatar that mimics facial expressions and shows teeth clenching, left and right eye movements (Fig. 2), eye blinking, left or right eye winking, eyebrow raising, and smiling.

Fig. 2. Screenshot of the Expressiv Suite application while looking right


In this app, next to the avatar there is a control panel that allows you to adjust the sensitivity with sliders. For each facial expression, you can check its effectiveness. If the Expressiv Suite app does not respond easily to a particular facial expression, use the slider to increase the sensitivity; if the stimulus is triggered too easily, causing an unwanted result, use the slider to decrease the sensitivity. You can increase or decrease the sensitivity by moving the sensitivity slider to the right or left, respectively. Each of the seven types of facial expressions can also be assigned an action in the form of any combination of key or mouse button presses. This makes it possible to operate applications, play games, or control a device such as a wheelchair or prosthesis using facial expressions. EmoKey is used for this (Fig. 3). Next to each slider is a key button, which is used to configure facial expressions for EmoKey. EmoKey combines Emotiv's technology with applications, converting detected events into any combination of keystrokes. EmoKey runs in the background, is safe for your device, and allows you to create mappings. EmoKey's mappings are relatively simple, such as linking the detection of teeth clenching to a mouse button press; the app then immediately captures the moment when the user clenches their teeth. To configure a facial expression for EmoKey, you need to select the expression you want to link and click the Key button next to its description (for example, clench teeth), which brings up a configuration dialog. You can also set the facial expression to be continuous by selecting Hold in the key box. There are further configuration options, such as key hold time and key trigger delay; with these, only actions to which key presses are assigned are sent to the active application window. Some expressions have the option "occurs" and others have "is equal to," "is greater than," or "is less than." For example, typing "0.3" in the condition field causes clench teeth to be reported when a clench greater than 30% of full scale is detected. You can also manage and save EmoKey mappings using the EmoKey menu at the top of the Control Panel window. Mappings can be loaded, saved, and suspended.

Fig. 3. Screenshot of the Expressiv Suite application with EmoKey used for teeth clenching

4.3. Methods of Prosthesis Design
The hand prosthesis was modeled in Blender, and many features of this rich software were used to create the hand: among others, scaling, the Extrude function, and the Bevel function. Two solids were used to create the prosthesis: a cylinder and a cube. To give the hand the right proportions, the Rotate and Move tools were used. The joints were created from cylinders, which were placed and scaled accordingly. The holes in the joints were made using the Boolean modifier. The fingers of the hand have two joints each and resemble hinge joints. They consist of three parts, but the last part belongs to the metacarpus (Fig. 4).

Fig. 4. Index finger design, side and top view (the red circle marks the hooks to which thin lines resembling tendons are attached)

The saddle joint of the thumb is too complicated, so it was replaced by a hinge joint in the hand model. The thumb consists of 2 parts (Fig. 5). The second part of the thumb connects directly to the metacarpus, as in the other fingers. In addition, so that it can replicate the behavior of the human hand, the thumb has been placed at an angle.
The largest part of the hand, and of the prosthesis, is the metacarpus (Fig. 6). It has a special depression at the bottom. At the top are parts that are supposed to reflect the tendons.
The hand prosthesis resembles a human hand in appearance. However, its mobility is much lower, as it has only 11 degrees of freedom and includes 9 movable joints. For the purposes of this project, however, this is sufficient. The final design of the hand prosthesis is shown in Fig. 7.

4.4. Final Model of the Hand Prosthesis
The entire model consists of 10 parts that were printed on a 3D printer using PLA filament. PrusaSlicer software was used to prepare the 3D prints. The parts of the prosthesis were printed in two stages using the Creality Ender 3 printer. The parts of the fingers were printed together with proper spacing, and the metacarpals were printed separately. The


Fig. 5. Thumb design

Fig. 6. Metacarpus design

Fig. 7. Final design of the hand prosthesis

metacarpal took the longest time to print: 10 hours. PLA filament in black and gray was used for printing. The prototype hand prosthesis consists of 10 parts. After printing, the parts were sawn apart so that they would fit together well. The parts of the prototype prosthesis were connected using 3 mm diameter screws. Figures 8 and 9 show the printed hand before and after assembly.

Fig. 8. 3D printed hand prosthesis – before assembly

Fig. 9. 3D printed hand prosthesis – after assembly

4.5. Signal Transmission to the Prosthetic Hand
When performing a movement, the user does not need to make a muscle movement directly, but simply clenches his teeth, blinks an eye, or raises his eyebrows. In creating an appropriate, effective activity matrix, it is important to differentiate each facial expression, and a movement should be appropriately assigned to each facial expression. This makes it possible to properly classify the user's intentions and thus build the executive system. Using Emotiv's EPOC+ device, the EEG signal is acquired from the patient's head surface using the electrodes placed on the device. Using the Expressiv Suite app included with the Emotiv EPOC+ NeuroHeadset hardware, it is possible to identify the facial expressions of the user wearing the device. The application uses EmoKey to assign the appropriate keys from the keyboard (i.e., the servo rotation time in µs) to a specific facial expression. The serial port monitor is a tool available in the Arduino software that allows the servo to be controlled. The minimum and maximum servo rotation times in µs are stored for a particular

facial expression. Depending on the particular facial expression, the servo rotation time that was previously assigned to it is entered on the serial port monitor. This causes the servo to rotate, for example by its maximum angle, which gives the effect of a hand movement. The acquisition of the EEG signal from the user's head by the Expressiv Suite application is based on wireless communication over a Bluetooth connection. Receiving the signal from the computer by the control system is, for the time being, done by wire. A schematic of signal acquisition and transmission to the prosthetic hand is shown in Figure 10.

Fig. 10. Diagram of signal transmission to the prosthesis

4.6. Communication
In the Expressiv Suite app, the user selects given facial expressions and assigns them specific numbers using EmoKey. These numbers are the corresponding rotation times of the servo. Two servo positions are demonstrated in the project: a 0-degree position and a 180-degree position. The 0-degree position corresponds to a time of 0 µs, and the 180-degree position corresponds to a time of 2400 µs. Table 1 shows example relationships of facial expressions to the finger movement of the prosthetic hand. A time of 2400 µs was assigned to the teeth-clench expression, and a time of 0 µs was assigned to the raised-brow expression. When the user performs a given facial expression, the corresponding servo time is output on the serial port monitor, causing the servo to rotate and move the prosthetic hand. Facial expressions can be customized to the user's liking; i.e., instead of a teeth clench, there can be an eye blink or a smile.

Tab. 1. Relationship of facial expression to hand finger movement

Facial expression    Servo rotation time    Movement
clench teeth         2400 µs                finger bends
raise brow           0 µs                   finger straightens

4.7. Tests
The use of the hand prosthesis prototype was tested for a selected finger and for selected facial expressions. Cables were attached to the prototype and to the servo. For the test, the hand prosthesis was placed in such a position that it could be moved only through brain waves. It was also necessary to switch the prepared program on, along with the necessary tool, the serial port monitor, to be able to control the servo. The next step was to properly prepare the Emotiv EPOC+ NeuroHeadset. After preparing the device on the computer, the Expressiv Suite app was selected, in which appropriate servo rotation times were assigned to the given facial expressions using EmoKey. Lifting the eyebrows was assigned "2400", and "0" was assigned to clenched teeth. The finger bends when the teeth are clenched, and when the eyebrows are lifted, the finger is straightened. Facial expressions can be adjusted depending on the user's preferences; therefore, performance tests were also performed during sideways movement of the eyeballs and during blinking. It is important to concentrate properly when performing a given facial expression. The user can trace facial expressions by looking at the avatar in the Expressiv Suite app. It is also advisable that the user, before attempting to make movements with such a prosthesis controlled by facial expressions, should practice the given facial expressions using the Expressiv Suite application itself. Fig. 11 shows the user during the prototype performance test.

4.8. Artifacts
The most common BCIs are based on EEG signals, and there are a number of interferences during the electroencephalographic test. Artifacts can be divided into technical and biological. The sources of interference are artifacts introduced by physiological processes, i.e., muscle activity, facial expressions, and heart rate, and by technical factors, such as the power grid. Therefore, the signal must be significantly amplified and must also account for the voltages generated at the skin-electrode interface. After filtering out mains frequency interference and performing the filtering and feature extraction, the signal should be clean; the result of these actions will be the expected signal properties. Undoubtedly, during diagnostic testing of EEG signals, artifacts are eliminated as much as possible. A normal EEG signal (no artifacts) is shown in Fig. 12.
Facial expressions, as already mentioned, are also among the artifacts, but for the purposes of using the Expressiv Suite in the Emotiv Control Panel, artifacts are in fact desirable. In this application, therefore, there is a built-in algorithm for the detection of artifacts, i.e., signal interference. Figures 12-15 show artifacts during different facial expressions.

5. Discussion
Some difficulties were encountered during the prototype development. The control problem was that, initially, in the concept implementation the Cognitiv Suite app in the Emotiv Control Panel was to be used for control. However, this application requires a very high level of concentration and trained senses to be able to
sary to properly place the servo. Then the program use it freely. Therefore, it was concluded that it would
dedicated to the microcontroller used was turned be better to control using the Expressiv Suite app,


which is more intuitive and simpler for the user. It is worth noting, therefore, that control by facial expressions is significant for this solution. When modeling the prosthetic hand, instead of using 14 joints as in the human hand, it was decided that 9 moving parts would be sufficient for the purpose of this prototype, and the thumb's saddle joint, which has too complicated a structure, was replaced by a hinge joint in the hand model.
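The expression-based control path discussed above (Section 4.6 and Table 1) can be sketched on the host side. The snippet below is illustrative only: the actual firmware in the project runs on a microcontroller and receives the servo time over the serial port; the function names and the linear pulse-to-angle conversion are assumptions, with only the 0 µs / 2400 µs endpoints and the two expressions taken from Table 1.

```python
# Illustrative host-side sketch of the facial-expression -> servo mapping.
# Function names are hypothetical; the 0 us / 2400 us endpoints and the
# two expressions come from Table 1 and Section 4.6.

# EmoKey-style mapping: each trained expression is assigned a servo
# rotation time in microseconds.
EXPRESSION_TO_PULSE_US = {
    "clench teeth": 2400,  # 180-degree servo position -> finger bends
    "raise brow": 0,       # 0-degree servo position
}

def expression_to_pulse(expression: str) -> int:
    """Return the servo rotation time (us) assigned to an expression."""
    return EXPRESSION_TO_PULSE_US[expression]

def pulse_to_angle(pulse_us: int) -> float:
    """Linear conversion assumed from the two endpoints in Section 4.6:
    0 us -> 0 degrees, 2400 us -> 180 degrees."""
    return pulse_us / 2400.0 * 180.0

if __name__ == "__main__":
    for expr in EXPRESSION_TO_PULSE_US:
        t = expression_to_pulse(expr)
        print(f"{expr}: {t} us -> {pulse_to_angle(t):.0f} deg")
```

In the prototype this value is written to the serial port and consumed by the microcontroller program; swapping clench teeth for a blink or a smile, as the text suggests, only changes the dictionary keys.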

Fig. 11. Test of using the hand prosthesis

Fig. 12. EEG signal – no artifacts

Fig. 13. Artifact – EEG signal during clenching teeth

Fig. 14. Artifact – EEG signal during raising brows

Fig. 15. Artifact – EEG signal during smiling

Fig. 16. Artifact – EEG signal during blinking eyes
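The mains-frequency filtering mentioned in the Artifacts section can be illustrated with a minimal notch filter. This is a generic sketch, not the paper's actual processing pipeline: the 50 Hz mains frequency, the 250 Hz sampling rate, and the pole radius are assumed values chosen for the example.

```python
# Minimal second-order IIR notch filter for mains interference (sketch).
# Assumptions: 50 Hz mains, 250 Hz sampling rate, pole radius 0.95.
import math

def notch_filter(x, fs, f0, r=0.95):
    """Zeros on the unit circle at +/-f0, poles at radius r just inside,
    so a narrow band around f0 is removed while other bands pass."""
    w0 = 2.0 * math.pi * f0 / fs
    b = [1.0, -2.0 * math.cos(w0), 1.0]          # numerator (zeros)
    a = [1.0, -2.0 * r * math.cos(w0), r * r]    # denominator (poles)
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for xn in x:                                  # direct-form difference eq.
        yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        y.append(yn)
        x2, x1 = x1, xn
        y2, y1 = y1, yn
    return y

def amplitude(x, fs, f):
    """Single-bin DFT amplitude estimate at frequency f."""
    n = len(x)
    s = sum(x[k] * complex(math.cos(2 * math.pi * f * k / fs),
                           -math.sin(2 * math.pi * f * k / fs))
            for k in range(n))
    return 2.0 * abs(s) / n

if __name__ == "__main__":
    fs, f_eeg, f_mains = 250.0, 10.0, 50.0
    t = [k / fs for k in range(1000)]
    # Slow "EEG-like" component at 10 Hz plus mains interference at 50 Hz
    x = [math.sin(2 * math.pi * f_eeg * tk) +
         math.sin(2 * math.pi * f_mains * tk) for tk in t]
    y = notch_filter(x, fs, f_mains)
    tail = y[500:]                                # skip the filter transient
    print("10 Hz kept:", round(amplitude(tail, fs, f_eeg), 2))
    print("50 Hz left:", round(amplitude(tail, fs, f_mains), 3))
```

The 50 Hz component is driven close to zero while the 10 Hz component passes nearly unchanged; a real EEG pipeline would combine such a notch with band-pass filtering and feature extraction, as the text describes.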


6. Conclusion
This article presents one of the many proposed solutions for improving the functioning of people without an upper limb: controlling a prosthetic arm using brain waves. The use of such a prosthesis is very important for disabled people. A hand prosthesis controlled by facial expressions can help amputees and people who have damaged innervation in the stump area. The solution uses a non-invasive method, so people who are not fully convinced by this method can test it for themselves without interfering with their bodies.

AUTHORS
Julia Żaba – Faculty of Electrical Engineering, Automatic Control and Informatics, Opole University of Technology, Opole, 45-758, Poland, E-mail: j.zaba.[email protected].

Szczepan Paszkiel* – Faculty of Electrical Engineering, Automatic Control and Informatics, Opole University of Technology, Opole, 45-758, Poland, E-mail: [email protected].

*Corresponding author

References
[1] Ramadan, R.A.; Vasilakos, A.V. Brain computer interface: control signals review. Neurocomputing, vol. 223, 2017, 26–44, doi:10.1016/j.neucom.2016.10.024.
[2] Bernal, S.L.; Celdrán, A.H.; Pérez, G.M. Neuronal Jamming cyberattack over invasive BCIs affecting the resolution of tasks requiring visual capabilities. Comput. Secur., vol. 112, 2022, doi:10.1016/j.cose.2021.102534.
[3] Shivwanshi, R.R.; Nirala, N. Concept of AI for acquisition and modeling of noninvasive modalities for BCI. Artif. Intell. Brain-Computer Interface, 2022, 121–144, doi:10.1016/B978-0-323-91197-9.00007-2.
[4] Dagdevir, E.; Tokmakci, M. Optimization of preprocessing stage in EEG based BCI systems in terms of accuracy and timing cost. Biomed. Signal Process. Control, vol. 67, 2021, doi:10.1016/j.bspc.2021.102548.
[5] Bassi, P.R.A.S.; Rampazzo, W.; Attux, R. Transfer learning and SpecAugment applied to SSVEP based BCI classification. arXiv 2020, doi:10.1016/j.bspc.2021.102542.
[6] Vilela, M.; Hochberg, L.R. Applications of brain-computer interfaces to the control of robotic and prosthetic arms. In Handbook of Clinical Neurology; Elsevier B.V., 2020; vol. 168, pp. 87–99.
[7] Na, R.; Hu, C.; Sun, Y.; Wang, S.; Zhang, S.; Han, M.; Yin, W.; Zhang, J.; Chen, X.; Zheng, D. An embedded lightweight SSVEP-BCI electric wheelchair with hybrid stimulator. Digit. Signal Process., vol. 116, 2021, 103101, doi:10.1016/j.dsp.2021.103101.
[8] Robinson, N.; Mane, R.; Chouhan, T.; Guan, C. Emerging trends in BCI-robotics for motor control and rehabilitation. Curr. Opin. Biomed. Eng., vol. 20, 2021, 100354, doi:10.1016/j.cobme.2021.100354.
[9] Miladinović, A.; Ajčević, M.; Jarmolowska, J.; Marusic, U.; Colussi, M.; Silveri, G.; Battaglini, P.P.; Accardo, A. Effect of power feature covariance shift on BCI spatial-filtering techniques: A comparative study. Comput. Methods Programs Biomed., vol. 198, 2021, doi:10.1016/j.cmpb.2020.105808.
[10] Soman, S.; Murthy, B.K. Using Brain Computer Interface for synthesized speech communication for the physically disabled. In Proceedings of the Procedia Computer Science; Elsevier B.V., vol. 46, 2015; 292–298.
[11] Noori, F.M.; Naseer, N.; Qureshi, N.K.; Nazeer, H.; Khan, R.A. Optimal feature selection from fNIRS signals using genetic algorithms for BCI. Neurosci. Lett., vol. 647, 2017, 61–66, doi:10.1016/j.neulet.2017.03.013.
[12] Gubert, P.H.; Costa, M.H.; Silva, C.D.; Trofino-Neto, A. The performance impact of data augmentation in CSP-based motor-imagery systems for BCI applications. Biomed. Signal Process. Control, vol. 62, 2020, doi:10.1016/j.bspc.2020.102152.
[13] Hernández-Del-Toro, T.; Reyes-García, C.A.; Villaseñor-Pineda, L. Toward asynchronous EEG-based BCI: Detecting imagined words segments in continuous EEG signals. Biomed. Signal Process. Control, vol. 65, 2021, doi:10.1016/j.bspc.2020.102351.
[14] Shi, B.; Wang, Q.; Yin, S.; Yue, Z.; Huai, Y.; Wang, J. A binary harmony search algorithm as channel selection method for motor imagery-based BCI. Neurocomputing, vol. 443, 2021, 12–25, doi:10.1016/j.neucom.2021.02.051.
[15] Janani, A.; Sasikala, M.; Chhabra, H.; Shajil, N.; Venkatasubramanian, G. Investigation of deep convolutional neural network for classification of motor imagery fNIRS signals for BCI applications. Biomed. Signal Process. Control, vol. 62, 2020, 102133, doi:10.1016/j.bspc.2020.102133.
[16] Zarrintaj, P.; Saeb, M.R.; Ramakrishna, S.; Mozafari, M. Biomaterials selection for neuroprosthetics. Curr. Opin. Biomed. Eng., vol. 6, 2018, 99–109.


[17] Kasim, M.A.A.; Low, C.Y.; Ayub, M.A.; Zakaria, N.A.C.; Salleh, M.H.M.; Johar, K.; Hamli, H. User-Friendly LabVIEW GUI for Prosthetic Hand Control Using Emotiv EEG Headset. In Proceedings of the Procedia Computer Science; Elsevier B.V., vol. 105, 2017; 276–281.
[18] Lange, G.; Low, C.Y.; Johar, K.; Hanapiah, F.A.; Kamaruzaman, F. Classification of Electroencephalogram Data from Hand Grasp and Release Movements for BCI Controlled Prosthesis. Procedia Technol., vol. 26, 2016, 374–381, doi:10.1016/j.protcy.2016.08.048.
[19] Alazrai, R.; Alwanni, H.; Daoud, M.I. EEG-based BCI system for decoding finger movements within the same hand. Neurosci. Lett., vol. 698, 2019, 113–120, doi:10.1016/j.neulet.2018.12.045.
[20] Downey, J.E.; Brooks, J.; Bensmaia, S.J. Artificial sensory feedback for bionic hands. In Intelligent Biomechatronics in Neurorehabilitation; Elsevier, 2019; pp. 131–145, ISBN 9780128149423.
[21] Guger, C.; Harkam, W.; Hertnaes, C.; Pfurtscheller, G. Prosthetic Control by an EEG-based Brain-Computer Interface (BCI).
[22] Müller-Putz, G.R.; Pfurtscheller, G. Control of an electrical prosthesis with an SSVEP-based BCI. IEEE Trans. Biomed. Eng., vol. 55, 2008, 361–364, doi:10.1109/TBME.2007.897815.
[23] Beyrouthy, T.; Al Kork, S.K.; Korbane, J.A.; Abdulmonem, A. EEG Mind controlled Smart Prosthetic Arm. In Proceedings of the 2016 IEEE International Conference on Emerging Technologies and Innovative Business Practices for the Transformation of Societies (EmergiTech); IEEE, 2016; pp. 404–409.
[24] Constantine, A.; Asanza, V.; Loayza, F.R.; Peláez, E.; Peluffo-Ordóñez, D. BCI System using a Novel Processing Technique Based on Electrodes Selection for Hand Prosthesis Control. IFAC-PapersOnLine, vol. 54, 2021, 364–369, doi:10.1016/j.ifacol.2021.10.283.
[25] Sensinger, J.W.; Hill, W.; Sybring, M. Prostheses—Assistive Technology—Upper. Encycl. Biomed. Eng., vols. 2-3, 2019, 632–644, doi:10.1016/B978-0-12-801238-3.99912-4.
[26] EMOTIV. Website online: www.emotiv.com (accessed August 2022).
[27] ARDUINO. Website online: https://store.arduino.cc/ (accessed August 2022).
[28] BLENDER. Website online: www.blender.org (accessed August 2022).
