Review
An Introductory Tutorial on Brain–Computer Interfaces and
Their Applications
Andrea Bonci 1 , Simone Fiori 1, * , Hiroshi Higashi 2 , Toshihisa Tanaka 3,4,5,6 and Federica Verdini 1
1 Dipartimento di Ingegneria dell’Informazione, Università Politecnica delle Marche, Via Brecce Bianche,
I-60131 Ancona, Italy; [email protected] (A.B.); [email protected] (F.V.)
2 Graduate School of Informatics, Kyoto University, Yoshidahonmachi 36-1, Sakyo-ku, Kyoto 606-8501, Japan;
[email protected]
3 Department of Electrical and Electronic Engineering, Tokyo University of Agriculture and Technology,
2-24-16, Nakacho, Koganei-shi, Tokyo 184-8588, Japan; [email protected]
4 RIKEN Center for Brain Science, 2-1, Hirosawa, Wako-shi, Saitama 351-0198, Japan
5 RIKEN Center for Advanced Intelligence Project, 1-4-1 Nihonbashi, Chuo-ku, Tokyo 103-0027, Japan
6 School of Computer Science and Technology, Hangzhou Dianzi University, Xiasha Higher Education Zone,
Hangzhou 310018, China
* Correspondence: [email protected]
Abstract: The prospect and potentiality of interfacing minds with machines has long captured human imagination. Recent advances in biomedical engineering, computer science, and neuroscience are making brain–computer interfaces a reality, paving the way to restoring and potentially augmenting human physical and mental capabilities. Applications of brain–computer interfaces are being explored in areas as diverse as security, lie detection, alertness monitoring, gaming, education, art, and human cognition augmentation. The present tutorial surveys the principal features and challenges of brain–computer interfaces (such as reliable acquisition of brain signals, filtering and processing of the acquired brainwaves, ethical and legal issues related to brain–computer interfaces (BCIs), data privacy, and performance assessment), with special emphasis on biomedical engineering and automation engineering applications. The content of this paper is aimed at students, researchers, and practitioners wishing to glimpse the multifaceted world of brain–computer interfacing.
Citation: Bonci, A.; Fiori, S.; Higashi, H.; Tanaka, T.; Verdini, F. An Introductory Tutorial on Brain–Computer Interfaces and Their Applications. Electronics 2021, 10, 560. https://doi.org/10.3390/electronics10050560
Keywords: brain–computer interfacing; automation and control; electrophysiological recordings; neuronal oscillations
The essence of BCI technology is, through sensors placed over the head, to record such neuronal
oscillations, which encode brain activity, and to decipher the neuronal oscillation code.
The slow speeds, high error rate, susceptibility to artifacts, and complexity of early
BCI systems have been challenges for implementing workable real-world systems [9].
Originally, the motivation for developing BCIs was to provide severely disabled individuals
with a basic communication system. In recent years, advances in computing and biosensing
technologies improved the outlook for BCI applications, making them promising not only
as assistive technologies but also for mainstream applications [10].
While noninvasive techniques are the most widely used in applications devoted both
to regular consumers and to restoring the functionality of disabled subjects, invasive
implants were initially developed through animal experiments and applied to the control
of artificial prostheses. At the end of the 1990s, implants were applied to humans
to accomplish simple tasks such as moving a screen cursor. The electroencephalogram
(EEG), for instance, is a typical signal used as an input for BCI applications and refers to
the electrical activity recorded through electrodes positioned on the scalp. During the last
decades, EEG-based BCI became one of the most popular noninvasive techniques. The EEG
measures the summation of synchronous activity of neurons [11] that have the same spatial
orientation after an external stimulus is produced. This technique was used to register
different types of neural activity such as evoked responses (ERs), also known as evoked
potentials (EPs) [12,13], or induced responses such as event-related potentials, event-related
desynchronisations, and slow cortical potentials [13].
In recent years, the motivation for developing BCIs has been not only to provide an
alternate communication channel for severely disabled people but also to use BCIs for
communication and control in industrial environments and consumer applications [14].
It is also worth mentioning a recent technological development, closely related to BCI,
known as a brain-to-brain interface (BBI). A BBI is a combination of the brain–computer
interface and the computer–brain interface [15]. Brain-to-brain interfaces allow for direct
transmission of brain activity in real time by coupling the brains of two individuals.
The last decade witnessed an increasing interest towards the use of BCI for games and
entertainment applications [16], given the amount of meaningful information provided by
BCI devices that is not easily obtainable through other input modalities. BCI signals can be
utilized in this kind of application to collect data describing the cognitive states of a user,
which has proved useful in adapting a game to the player’s emotional and cognitive conditions.
Moreover, research in BCI can provide game developers with information that is relevant
to controlling a gaming application or to developing hardware and software products to
build an unobtrusive interface [17].
The use of mental states to trigger surroundings and to control the external envi-
ronment can be achieved via passive brain–computer interfaces (pBCIs) to replace a lost
function in persons with severe motor disabilities and without possibility of function
recovery (i.e., amyotrophic lateral sclerosis or brainstem stroke) [18]. In such instances, BCI
is used as a system that allows for direct communication between the brain and distant
devices. Over the last years, research efforts have been devoted to its use in smart environ-
mental control systems, fast and smooth movement of robotic arm prototypes, as well as
motion planning of autonomous or semiautonomous vehicles and robotic systems [19,20].
BCI can be used in an active way, known as active BCI, in which the user voluntarily
modulates brain activity to generate a specific command to the surrounding environment,
replacing or partially restoring lost or impaired muscular abilities. The alternative modality,
the pBCI [21–23], derives its outputs from arbitrary brain activity without the intention
of specific voluntary control (i.e., it exploits implicit information on the user’s
state). In fact, in systems based on pBCIs, the users do not try to control their brain
activity [24]. pBCIs have been used in modern research on adaptive automation [25] and
in augmented user’s evaluation [26]. A recent study demonstrated the higher resolution
of neurophysiological measures in comparison to subjective ones and how the simultane-
ous employment of neurophysiological measures and behavioral ones could allow for a
essential for near-infrared spectroscopy analytic methods. In general, BCI systems are
classified, with respect to the technique used to pick up the signals, as invasive, when the
brain signals are recorded via surgically implanted electrodes, or as noninvasive or partially invasive.
Noninvasive implants are easy to wear, but they produce poor signal resolution
because the skull dampens signals, dispersing and blurring the electromagnetic waves
created by the neurons. Even though brain waves can still be detected, it is more difficult to
determine the area of the brain that created the recorded signals or the actions of individual
neurons. EEG is the most studied noninvasive interface, mainly due to its fine temporal
resolution, ease of use, portability, and low setup cost. A typical EEG recording setup is
shown in Figure 1.
Figure 1. The most widely used method for recording brain activity in brain–computer interfacing is
EEG, as it is a technique that is simple, noninvasive, portable, and cost-effective. Typical electroencephalogram recording setup: a cap carrying contact electrodes and wires (from the Department
of Electrical and Electronic Engineering at the Tokyo University of Agriculture and Technology).
The wires are connected to amplifiers that are not shown in the figure. Amplifiers also improve the
quality of acquired signals through filtering. Some amplifiers include analog-to-digital converters to
allow brain signals to be acquired by (and stored on) a computer.
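The filtering step performed by the amplifiers can be sketched numerically. The following Python snippet (all names and parameter values are illustrative assumptions, not part of the setup shown in Figure 1) band-pass filters a digitized EEG trace to suppress slow drift and 50 Hz mains interference:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_eeg(signal, fs, low=1.0, high=40.0, order=4):
    """Zero-phase Butterworth band-pass filter for a raw EEG trace."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)

# Synthetic example: a 10 Hz "alpha-like" rhythm buried in 50 Hz mains noise.
fs = 250                                  # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
raw = np.sin(2 * np.pi * 10 * t) + 0.8 * np.sin(2 * np.pi * 50 * t)
clean = bandpass_eeg(raw, fs)             # 10 Hz rhythm retained, 50 Hz attenuated
```

The zero-phase (forward–backward) filtering avoids distorting the latency of stimulus-locked components, which matters for the evoked-potential paradigms discussed later.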
The brain is extremely complex (there are about 100 billion neurons in a human brain,
and each neuron is constantly sending and receiving signals through an intricate network
of connections). Assuming that all thoughts or actions are encoded in electric signals in the
brain is a gross oversimplification, as there are chemical processes involved as well, which
EEGs are unable to pick up. Moreover, the recorded EEG signal is weak and prone to
interference: EEGs measure tiny voltage potentials while the blinking eyelids of the subject
can generate much stronger signals. Another substantial barrier to using EEG as a BCI is
the extensive training required before users can effectively work with such technology.
ECoG measures the electrical activity of the brain taken from beneath the skull in
a similar way to noninvasive electroencephalography, but the electrodes are embedded
in a thin, plastic pad that is placed directly above the cortex, beneath the dura mater.
The survey in [49] overviews progress in a recent field of applied research, micro-
corticography (µECoG). Miniaturized implantable µECoG devices possess the advantage
of providing higher-density neural signal acquisition and stimulation capabilities in a
minimally invasive implant. The increased spatial resolution of the µECoG array is useful
for diagnosing and treating neuronal diseases with greater specificity. In general, invasive
devices to measure brain signals are based on electrodes directly implanted in the patient’s
grey matter of the brain during neurosurgery. Because the electrodes lie in the grey matter,
invasive devices produce the highest-quality signals of BCI devices but are prone to scar-
tissue build-up, causing the signal to become weaker or even null as the body reacts to a
foreign object in the brain. Partially invasive BCI devices are implanted inside the skull but
rest outside the brain rather than within the grey matter. They produce better-resolution
signals than noninvasive BCIs, in which the bone tissue of the cranium deflects and deforms
the signals, and they have a lower risk of forming scar tissue in the brain than fully invasive BCIs.
Invasive and partially invasive technologies remain limited to healthcare fields [50].
Medical grade brain–computer interfaces are often used in assisting people with damage to
their cognitive or sensorimotor functions. Neuro-feedback is starting to be used for stroke
patients by physical therapists to assist in visualizing brain activities and in promoting
brain plasticity [51]. Such plasticity enables the nervous system to adapt to environmental
pressures, physiologic changes, and experiences [52]. Because of the plasticity of the brain,
undamaged parts of the brain can take over the functions disabled by the injury.
Indeed, medical-grade BCI was originally designed for the rehabilita-
tion of patients following paralysis or loss of limbs. In this way, the control of robotic
arms or interaction with computerized devices could be achieved by conscious thought
processes, whereby users imagine the movement of neuroprosthetic devices to perform
complex tasks [53]. Brain–computer interfacing has also demonstrated profound benefits
in correcting blindness through phosphene generation, thereby introducing a limited field
of vision to previously sightless patients [54]. Studies have shown that patients with
access to BCI technologies recover faster from serious mental and physical traumas
than those who undergo traditional rehabilitation methods [55]. For this very
reason, the use of BCI has also expanded (albeit tentatively) into the fields of Parkinson’s
disease, Alzheimer’s disease, and dementia research [56,57].
A class of wireless BCI devices was designed as pill-sized chips of electrodes im-
planted on the cortex [58]. Such a small volume houses an entire signal processing system:
a lithium ion battery, ultralow-power integrated circuits for signal processing and conver-
sion, wireless radio and infrared transmitters, and a copper coil for recharging. All the
wireless and charging signals pass through an electromagnetically transparent sapphire
window. Not all wireless BCI systems are integrated and fully implantable. Several proof of
concept demonstrations have shown encouraging results, but barriers to clinical translation
still remain. In particular, intracortical prostheses must satisfy stringent power dissipation
constraints so as not to damage the cortex [59]. Starting from the observation that approx-
imately 20% of traumatic cervical spinal cord injuries result in tetraplegia, the authors
of [60] developed a semi-invasive technique that uses an epidural wireless brain–machine
interface to drive an exoskeleton. BCI technology can empower individuals to directly
control electronic devices located in smart homes/offices and associative robots via their
thoughts. This process requires efficient transmission of ECoG signals from implanted
electrodes inside the brain to an external receiver located outside on the scalp. The contribution [61] discusses efficient, low-complexity, and balanced BCI communication techniques
to mitigate interferences.
Cognitive brain systems are amenable to conscious control, yielding better regulation
of magnitude and duration of localized brain activity. Signal generation, acquisition,
processing, and commanding chain in a BCI system is illustrated in Figure 2. Likewise, the
visual cortex is a focus of signal acquisition since the electrical signals picked up from the
visual cortex tend to synchronize with external visual stimuli hitting the retina.
Figure 2. Basic brain–computer interface (BCI) schematic: How targeted brain oscillation signals (or brainwaves) originate
from a visual stimulus or a cognitive process and how they get acquired, processed, and translated into commands.
One of the most significant obstacles that must be overcome in pursuing the utilization
of brain signals for control is the establishment of a valid method to extract event-related
information from a real-time EEG [63]. Most BCIs rely on one of three types of mental
activities, namely, motor imagery [64], P300 [65], and steady-state visually evoked poten-
tials (SSVEPs) [66]. Some BCI may utilize more than one such mental activity; hence, they
are referred to as “hybrid BCIs” [67]. Once brain signal patterns are mapped to cognitive
tasks, BCI systems can decode the user’s goals. By modulating such brain signals,
patients can express their intent to the BCI system, and those brain signals can act as
control signals in BCI units.
Some users experience significant difficulty in using BCI technologies. It is reported
that approximately 15–30% of users cannot modulate their brain signals, which results in
the inability to operate BCI systems [68]. Such users are called “BCI-illiterate” [69].
The sensorimotor EEG changes of the motor cortex during active and passive movement
and motor imagery are similar. The study [70] showed that it is possible to use classifiers
calculated with data from passive and active hand movement to detect motor imagery.
Hence, a physiotherapy session for a stroke patient could be used to obtain data to learn a
classifier and the BCI-rehabilitation training could start immediately.
P300 is a further type of brain activity that can be detected by means of EEG record-
ings. P300 is a brainwave component that occurs after a stimulus that is deemed “im-
portant”. The existence of the P300 response may be verified by the standard “oddball
paradigm”, in which a deviant stimulus presented within a stream of standard stimuli
elicits the P300 component. In the EEG signal, P300 appears as a
positive wave 300 ms after stimulus onset and serves as a link between stimulus charac-
teristics and attention. In order to record P300 traces, the electrodes are placed over the
posterior scalp. Attention and working memory are considered as cognitive processes
underlying P300 amplitude [71]. In fact, it was suggested that P300 is a manifestation
of a context-updating activity occurring whenever one’s model of the environment is
revised. The study in [71] investigated the support of attentional and memory processes in
controlling a P300-based BCI in people with amyotrophic lateral sclerosis.
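The trial-averaging idea underlying P300 detection can be sketched on synthetic data. In the following Python snippet, a Gaussian bump peaking near 300 ms stands in for a real P300, and all names, the sampling rate, and the noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                                # sampling rate in Hz (assumed)
n = int(0.8 * fs)                       # epoch: 0..800 ms after stimulus onset
t_ms = np.arange(n) / fs * 1000.0

def simulate_epoch(is_target):
    """One post-stimulus epoch: noise plus, for rare 'oddball' targets,
    a positive deflection peaking near 300 ms (a crude P300 stand-in)."""
    x = rng.normal(0.0, 2.0, n)
    if is_target:
        x += 5.0 * np.exp(-((t_ms - 300.0) ** 2) / (2.0 * 60.0 ** 2))
    return x

# Averaging across trials cancels the noise but not the stimulus-locked wave.
target_avg = np.mean([simulate_epoch(True) for _ in range(100)], axis=0)
standard_avg = np.mean([simulate_epoch(False) for _ in range(100)], axis=0)
peak_latency_ms = t_ms[np.argmax(target_avg - standard_avg)]
```

Averaging many epochs is essential here because, as the text notes, single-trial EEG is dominated by noise that is far larger than the evoked component itself.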
The study in [72] investigated a BCI technology based on EEG responses to vibro-
tactile stimuli around the waist. P300-BCIs based on tactile stimuli have the advantage
of not taxing the visual or auditory system, hence being especially suitable for patients
whose vision or eye movements are impaired. The contribution of [73] suggested a method
for the extraction of discriminative features in electroencephalography evoked-potential
latency. Based on offline results, evidence is presented indicating that a full surround
sound auditory BCI paradigm has potential for an online application. The auditory spatial
BCI concept is based on a directional audio stimuli delivery technique, which employs a
loudspeaker array. The stimuli presented to the subjects vary in frequency and timbre.
Such research resulted in a methodology for finding and optimizing evoked response
latencies in the P300 range in order to classify the subject’s chosen targets (or the ignored
non-targets).
The SSVEP paradigm operates by exposing the user to oscillating visual stimuli
(e.g., flickering Light-Emitting Diodes (LEDs) or phase-reversing checkerboards).
Electrical activity corresponding to the frequency of such oscillation (and its multiples) can
be measured from the occipital lobe of the brain. The user issues a command by choosing
a stimulus (and therefore a frequency). Figure 3 shows an experimental setup where a
patient gazes at a screen that presents eight different oscillating patterns.
Figure 3. Experimental setup of a steady-state visually evoked potential (SSVEP) recording: A patient
gazes at a screen that presents eight different patterns corresponding to eight different intended
commands to the interface. Each pattern oscillates at a different frequency: whenever the patient
gazes at a particular pattern, neuronal oscillations take place in the visual cortex that lock onto
the frequency of the flickering pattern (from the Department of Electrical and Electronic Engineering
at the Tokyo University of Agriculture and Technology).
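The frequency-tagging principle behind SSVEP decoding can be sketched with a spectral-power comparison. In this Python snippet, the candidate flicker frequencies, the gaze-window length, and all names are illustrative assumptions (practical SSVEP decoders often use canonical correlation analysis instead):

```python
import numpy as np

fs = 250                                 # sampling rate in Hz (assumed)
stim_freqs = [8.0, 10.0, 12.0, 15.0]     # flicker frequencies (assumed)
t = np.arange(0, 4.0, 1.0 / fs)          # 4 s gaze window

def classify_ssvep(occipital, fs, candidates):
    """Pick the flicker frequency with the largest spectral power,
    summing the fundamental and its second harmonic."""
    spec = np.abs(np.fft.rfft(occipital)) ** 2
    freqs = np.fft.rfftfreq(occipital.size, 1.0 / fs)
    def power_at(f):
        return spec[np.argmin(np.abs(freqs - f))]
    scores = [power_at(f) + power_at(2 * f) for f in candidates]
    return candidates[int(np.argmax(scores))]

# Simulated occipital response to gazing at the 12 Hz pattern, with noise.
rng = np.random.default_rng(1)
sig = (np.sin(2 * np.pi * 12 * t) + 0.3 * np.sin(2 * np.pi * 24 * t)
       + rng.normal(0.0, 0.5, t.size))
chosen = classify_ssvep(sig, fs, stim_freqs)   # selects 12.0 here
```

Including the second harmonic in the score reflects the observation in the text that activity appears at the stimulation frequency and its multiples.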
BCI also includes significant target systems outside clinical scopes. By evoking visual
responses to flickers of different frequencies, SSVEP systems can be used to provide different
inputs in control applications. Martišius et al. [74] described SSVEP as a means to
successfully control computer devices and games; specifically, SSVEP was used
to build a human–computer interaction system for decision-making improvement. In
particular, in that work, a model of traffic lights was designed as a case study. The experiments
carried out on such a model, which involved decision-making situations, allowed an SSVEP–
BCI system to assist people in making correct decisions.
In asynchronous protocols, the user can think of some mental tasks at any time (the user
controls the system). Asynchronous BCIs [28,76] require extensive training (of the
order of many weeks), their performance is user-dependent, and their accuracy is not
as high as that of synchronous ones [78]. On the other hand, synchronous BCIs require
minimal training and have stable performance and high accuracy. Asynchronous BCI
is nevertheless more realistic and practical than a synchronous system in that BCI commands can be
generated whenever the user wants [79].
Asynchronous BCI systems are more practicable than synchronous ones in real-world
applications. A key challenge in asynchronous BCI design is to discriminate intentional con-
trol and non-intentional control states. In the contribution [66], a two-stage asynchronous
protocol for an SSVEP-based BCI was introduced. This visual-evoked potential-based
asynchronous BCI protocol was extended to mixed frequency and phase-coded visual
stimuli [80]. Brain–computer interfacing systems can also use pseudo-random stimulation
sequences on a screen (code-based BCI) [81]. Such a system can control a robotic
device. In this case, the BCI controls may be overlaid on the video that shows a robot
performing certain tasks.
Hybrid BCIs combine different input signals to provide more flexible and effective
control. The combination of these input signals makes it possible to use a BCI system for a
larger patient group and to make the system faster and more reliable. Hybrid BCIs can also
use one brain signal and a different type of input, such as an electrophysiological signal
(e.g., the heart rate) or a signal from an external device such as an eye-tracking system. For
instance, a BCI can be used as an additional control channel in a video game that already
uses a game pad or it can complement other bio-signals such as electrocardiogram or blood
pressure in an application monitoring a driver’s alertness [14]. The contribution of [67]
describes BCIs in which functioning is based on the classification of two EEG patterns,
namely, the event-related (de)synchronisation of sensorimotor rhythms and SSVEPs.
over the frontal electrodes. Such results highlighted that the proposed algorithm can
provide comparable performance in terms of blink correction, without requiring EOG
channels, a high number of electrodes, or a high computational effort, thus preserving
EEG information from blink-free signal segments. More generally, blind source separation
(BSS) is an effective and powerful tool for signal processing and artifact removal from
electroencephalographic signals [86]. For high-throughput applications such as BCIs,
cognitive neuroscience, or clinical neuromonitoring, it is of prime importance that blind
source separation is effectively performed in real time. In order to improve the speed
of a BSS-based BCI, the parallelism that hardware provides may be exploited: the
obtained results show that a co-simulation environment greatly reduces computation time.
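Blind source separation for artifact removal can be sketched with scikit-learn's FastICA on synthetic data. The mixing matrix, the kurtosis-based rule for spotting the artifact component, and all names here are illustrative assumptions, not the specific method of [86]:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
fs, n = 250, 1000
t = np.arange(n) / fs

# Two underlying sources: a neural-like 10 Hz rhythm and a blink-like artifact.
rhythm = np.sin(2 * np.pi * 10 * t)
blink = np.zeros(n)
blink[200:230] = 8.0                     # brief high-amplitude deflection
S = np.c_[rhythm, blink]

# Linear instantaneous mixing onto three "electrodes".
A = np.array([[1.0, 0.5], [0.8, 1.0], [0.3, 0.2]])
X = S @ A.T + 0.01 * rng.normal(size=(n, 3))

ica = FastICA(n_components=2, random_state=0)
est = ica.fit_transform(X)               # estimated sources, up to scale/order

# Identify the artifact component by its kurtosis (spiky -> heavy-tailed),
# then reconstruct the recording with that component zeroed out.
def kurt(x):
    x = (x - x.mean()) / x.std()
    return np.mean(x ** 4) - 3.0

artifact_idx = int(np.argmax([kurt(est[:, k]) for k in range(2)]))
est_clean = est.copy()
est_clean[:, artifact_idx] = 0.0
X_clean = ica.inverse_transform(est_clean)
```

Zeroing the artifact source and projecting back preserves the rhythmic activity on every channel, which is the property that makes BSS attractive for cleaning multichannel EEG.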
Once the features of acquired brain signals are extracted by means of a signal process-
ing algorithm, they can be classified using an intelligent/adaptive classifier. Typically, a
choice between different classifiers is performed, although ways of combining predictions
from several nonlinear regression techniques can be exploited. Some research endeavors
have focused on the application of dynamic models such as the combined Hidden Markov
Autoregressive model [87] to process and classify the acquired signals, while others have
focused on the use of Hidden Markov Models and Kalman filters [88]. The contribution
in [89] suggested eliminating redundancy in high-dimensional EEG signals and reducing
the coupling among different classes of EEG signals by means of principal component
analysis, employing Linear Discriminant Analysis (LDA) to extract features that
represent the raw signals; subsequently, a voting-based extreme learning machine (ELM)
method was used to classify such features. The contribution in [90] proposed the
use of indexes applied to BCI recordings, such as the largest Lyapunov exponent, the
mutual information, the correlation dimension, and the minimum embedding dimension,
as features for the classification of EEG signals. A multi-layer perceptron classifier and
a Support-Vector Machine (SVM) classifier based on k-means clustering were used to
accomplish classification. A support-vector machine classification based method applied
to P300 data was also discussed in the contribution [91].
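A dimensionality-reduction-plus-classification pipeline of the kind just described can be sketched as follows. This Python example uses synthetic feature vectors, and LDA itself performs the final classification in place of the voting-based ELM of [89]; all sizes and names are illustrative assumptions:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Toy stand-in for feature vectors from two EEG classes (e.g., left- vs.
# right-hand motor imagery): high-dimensional, separated along a few axes.
n_per_class, dim = 200, 64
class0 = rng.normal(0.0, 1.0, (n_per_class, dim))
class1 = rng.normal(0.0, 1.0, (n_per_class, dim))
class1[:, :5] += 1.5                     # class difference in 5 dimensions
X = np.vstack([class0, class1])
y = np.array([0] * n_per_class + [1] * n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

The held-out split matters in BCI work: trial-to-trial correlations make within-session accuracy optimistic unless training and test trials are kept separate.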
been damaged; into the cochlear nucleus; or into the inferior colliculus [96]. BCI could
prove useful in psychology, where it might help in establishing the psychological state of
patients, as described in [97]. A patient’s feelings can be predicted by examining the electrical
signals generated by the brain. Krishna et al. [97] proposed an emotion classification tool
based on a generalized mixture model and obtained 89% classification precision in
recognizing happiness, sadness, boredom, and neutral states in terms of valence. Further
applications have been proposed in the biometric field, creating an EEG-based cryptographic
authentication scheme [98], with significant results based on the
commitment scheme adopted from [99].
As noted earlier, the motivation for developing brain–computer interfaces extends
beyond providing an alternate communication channel for severely disabled people
to communication and control in industrial environments and consumer
applications [14]. BCI is useful not only for communication but also to allow mental states
of the operator to trigger the surroundings [18]. In such instances, BCI is used as a system
that allows direct communication between the brain and distant devices. Over the last
years, research efforts have been devoted to its use in smart environmental control systems,
fast and smooth movement of robotic arm prototypes, as well as motion planning of
autonomous or semi-autonomous vehicles. For instance, one study developed an SSVEP–
BCI for controlling a toy vehicle [100]. Figure 4 shows an experimental setup for controlling
a toy vehicle through SSVEP–BCI technology.
(a) SSVEP-based BCI. (b) A toy vehicle controlled through SSVEP–BCI technology.
Figure 4. An experimental setup for controlling a vehicle through a steady-state visually-evoked-potentials-based BCI
(from the Department of Electrical and Electronic Engineering at the Tokyo University of Agriculture and Technology).
quired in areas of usability, hardware/software, and system integration, but for successful
development, a BCI should also take user characteristics and acceptance into account.
Efforts have been focused on developing potential applications in multimedia com-
munication and relaxation (such as immersive virtual reality control). Computer gaming,
in particular, has benefited immensely from the commercialization of BCI technology,
whereby users can act out first-person roles through thought processes. Indeed, using brain
signals to control game play opens many possibilities beyond entertainment, since neurogaming has potential for accelerating wellness, learning, and other cognitive functions.
Major challenges must be tackled for BCIs to mature into an established communications
medium for virtual-reality applications, which range from basic neuroscience studies to
developing optimal peripherals and mental gamepads and more efficient brain-signal
processing techniques [103].
An extended BCI technology is called the “collaborative BCI” [104]. In this method,
tasks are performed by multiple people rather than just one person, which makes
the outcome more reliable. Brain–computer interfacing relies on the focused concentration
of a subject for optimal performance; in a collaborative BCI, if one person loses
concentration for any reason, the other subjects compensate and produce
the missing commands. Collaborative BCI has found widespread applications in learning and
communication, as well as in improving social, creative, and emotional skills. The authors
of the paper [105] built a collaborative BCI and focused on the role of each subject in a
study group, trying to answer the question of whether some subjects would be better
removed or are fundamental to enhancing group performance. The goal of
the paper in [106] was to introduce a new way to reduce the time identifying a message or
command. Instead of relying on brain activity from one subject, the system proposed by the
authors of [106] utilized brain activity from eight subjects performing a single trial. Hence,
the system could rely on an average based on eight trials, which is more than sufficient
for adequate classification, even though each subject contributed only one trial. The paper
in [107] presents a detailed review about BCI applications for training and rehabilitation of
students with neurodevelopmental disorders.
Machine-learning and deep-learning approaches based on the analysis of physiological
data play a central role since they can provide a means to decode and characterize task-
related brain states (i.e., reducing a multidimensional problem to a one-dimensional one)
and to differentiate relevant brain signals from those that are not task-related. In this regard,
researchers tested BCI systems in daily life applications, illustrating the development and
effectiveness of this technique [18,108,109]. EEG-based pBCI became a relevant tool for
real-time analysis of brain activity since it can potentially provide information about the
operator’s cognitive state without distracting the user from the main task and without any
conditioning from the subjective judgment of an observer or of the user themselves [25]. Another
growing area for BCIs is mental workload estimation, exploring the effects of mental
workload and fatigue upon the P300 response (used for word-spelling BCIs) and the alpha
and theta EEG bands. In this regard, there is currently a movement within the BCI community
to integrate other signal types into “hybrid BCIs” [67] to increase the granularity of the
monitored response.
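Band-power features of the kind used for workload estimation can be sketched numerically. In this Python example the synthetic signal, the sampling rate, and the theta/alpha ratio as a workload index are all illustrative assumptions (the ratio is one commonly used index, not the specific measure of the studies cited above):

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    """Mean power spectral density of x within the [lo, hi] Hz band."""
    f, pxx = welch(x, fs=fs, nperseg=2 * fs)
    band = (f >= lo) & (f <= hi)
    return pxx[band].mean()

fs = 250                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10.0, 1.0 / fs)
rng = np.random.default_rng(5)
# Synthetic trace with a strong theta (6 Hz) and a weaker alpha (10 Hz) rhythm.
eeg = (1.5 * np.sin(2 * np.pi * 6 * t)
       + 0.5 * np.sin(2 * np.pi * 10 * t)
       + rng.normal(0.0, 0.3, t.size))

# A simple workload-related index: theta power relative to alpha power.
workload_index = band_power(eeg, fs, 4.0, 8.0) / band_power(eeg, fs, 8.0, 13.0)
```

Welch averaging over overlapping windows trades frequency resolution for variance reduction, which suits continuous monitoring better than a single long FFT.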
the EEG signals are usually picked up by multiple electrodes [2,123,124]. Concerning EEG
systems, currently, research is being conducted toward producing dry sensors (i.e., without
any conductive gel) and eventually a water-based technology instead of the classical gel-
based technology, providing high signal quality and better comfort. As reported in [125],
three different kinds of dry electrodes were compared against a wet electrode. Dry electrode
technology showed excellent performance, comparable to wet electrodes in terms of signal
spectra and mental-state classification. Moreover, the use of dry electrodes reduced the
time taken to apply sensors, hence enhancing user comfort. In the case of multi-electrode
measurement, the International 10-20 [126], Extended 10-20 [127], International 10-10 [128],
and the International 10-5 [129,130] methods have stood as the de facto standards for
electrode arrangement.
In these systems, the locations on a head surface are described by relative distances
between cranial landmarks. In the context of the International 10-20 system, for example,
the landmarks are the nasion, a point located between the eyes at the height of the eyes, and the
inion, which is the most prominent projection of the occipital bone at the posteroinferior
(lower rear) part of the human skull [126]. The lines connecting these landmarks along the
head surface are divided by marks into segments of 10% and 20% of the whole length.
Each pick-up point is defined as the intersection of the lines connecting
these marks along the head surface. Typical electrode arrangements are shown in Figure 5.
Figure 5. The electrode arrangement of the International 10-20 method and the International 10-10
method. The circles show the electrodes defined by the International 10-20 method. The circles and
the crosses show the electrodes defined by the International 10-10 method.
EEG systems record the electrical potential difference between any two electrodes [126].
Referential recording reads the electrical potential difference between a target electrode
and a reference electrode, the latter being common to all channels. The earlobes, which are
considered electrically inactive, are widely used as reference sites. In contrast, bipolar
recording reads the electrical potential difference between pairs of electrodes located on the
head surface.
Electroencephalogram recording systems are generally more compact than the systems
used in functional magnetic resonance imaging, near-infrared spectroscopy, and magneto-
encephalography. Two further important aspects of brain signal acquisition are temporal and
spatial resolution. Spatial resolution refers to the degree of accuracy with which brain
activity can be located in space, while temporal resolution refers to the degree of accuracy,
on a temporal scale, with which a functional imaging technique can describe a neural
event [131].
Electronics 2021, 10, 560 15 of 43
The temporal resolution of electroencephalography is higher than the temporal reso-
lution afforded by functional magnetic resonance and by near infrared spectroscopy [132].
However, since it is difficult to make electrodes smaller, the spatial resolution of elec-
troencephalography is very low compared to spatial resolution afforded by other devices.
In addition, the high-frequency components of the electrical activity of the neurons de-
crease in the observed signals because the physiological barriers between the emitters
and the receivers, such as the skull, work as lowpass filters. The signals picked up by
the electrodes are contaminated by the noise caused by poor contacting of the electrodes
with the skin, by muscle movements (electromyographic signals), and by eye movements
(electrooculographic signals).
Figure 6. Examples of event-related potentials: P300 waveforms (amplitude in µV versus time in
seconds after the stimulus). The illustrated signals were averaged over 85 trials. “Target” is the
signal observed when the subject attends the displayed stimulus; “Non-target” is the signal observed
for a stimulus ignored by the subject.
of 500 to 1000 ms [135], as illustrated in Figure 7. The user gazes at a target symbol on
the screen. When the row or column including the target symbol flashes, the user counts
the occurrences of the flash. By detecting a P300 at the flashes of the row and the column,
the interface can identify the symbol that the user is gazing at.
Figure 7. Stimuli in the telephone-call brain–computer interface: a 3 × 4 numeric keypad (digits 1–9
and 0) is displayed, and each row and column flashes according to the order indicated by the arrows.
Although the majority of P300-based BCIs use P300 responses to visual stimuli, visual
stimuli are not suitable for patients whose vision or eye movements are impaired. For
such users, alternative BCIs that use either auditory or tactile stimuli have been developed.
Several giant leaps have been made in the BCI field in recent years from several points
of view. For example, many works have been produced in terms of new BCI spellers. The
Hex-O-Spell, a gaze-independent BCI speller that relies on imagined movement, was
first described in 2006 by Blankertz et al. [136] and presented in [76,137]. As reported
in [138], the Hex-O-Spell was transformed into an ERP (P300) system to test whether ERP
spellers could also be gaze-independent, and it was compared with two other ERP-based
gaze-independent spellers, the Cake Speller and the Center Speller [139]. The purpose
was to check whether BCI spellers could replace eye-tracker-based speller systems. The
aforementioned change in the design could improve the performance of the speller and
could provide more useful control without the need for spatial attention. A typical
implementation of auditory P300-based interfaces exploits spatially distributed auditory
cues, generated via multiple loudspeakers [140,141]. Tactile P300-based BCIs are also
effective as an alternative to visual-stimuli-based interfaces [72]. In said kind of BCI,
vibrating cues are given to users by multiple tactors.
The detection of a P300 response is not straightforward due to the very low signal-
to-noise ratio of the observed EEG readouts; thus, it is necessary to record several trials
for the same target symbol. Averaging over multiple trials as well as lowpass filtering
are essential steps to ensure the recognition of P300 responses. Statistical approaches
are used for the detection of a P300 response, such as linear discriminant analysis (LDA)
preceded by dimensionality reduction (e.g., downsampling and principal component
analysis), as well as convolutional neural networks [142]. Stepwise linear discriminant
analysis is widely used as an implementation of LDA. Moreover, the well-known SVM
algorithm is also applicable to the classification of P300 responses.
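The averaging-and-LDA pipeline described above can be illustrated on synthetic data. Everything below is an illustrative assumption (the amplitudes, the Gaussian P300-like bump, the trial counts, and the `features` helper are ours); it is a minimal two-class Fisher discriminant, not the stepwise LDA of the cited studies:

```python
import numpy as np

rng = np.random.default_rng(0)
fs, T = 256, 256                       # 1-s epochs sampled at 256 Hz
t = np.arange(T) / fs
p300 = 4e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))   # bump near 300 ms

def epoch(target):
    """One noisy EEG epoch; target epochs contain the P300-like bump."""
    return (p300 if target else 0) + 10e-6 * rng.standard_normal(T)

def features(trials):
    """Average several trials to raise the SNR, then downsample to 32 features."""
    return np.mean(trials, axis=0).reshape(32, -1).mean(axis=1)

X, y = [], []
for label in (0, 1):                   # 0: non-target, 1: target
    for _ in range(200):
        X.append(features(np.stack([epoch(label) for _ in range(8)])))
        y.append(label)
X, y = np.array(X), np.array(y)

# Fisher linear discriminant: w = Sw^{-1} (mu1 - mu0), threshold at the midpoint
mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)
w = np.linalg.solve(Sw, mu1 - mu0)
threshold = w @ (mu0 + mu1) / 2
accuracy = np.mean((X @ w > threshold).astype(int) == y)
```

Averaging eight trials and downsampling by a factor of eight reduces the noise standard deviation per feature by a factor of eight, which is what makes the weak P300 bump separable.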
Rapid serial visual presentation appears to be one of the most appropriate paradigms
for patients using a P300-based brain–computer interface, since ocular movements are
not required. However, the use of different locations for each stimulus may improve the
overall performance. The paper in [143] explored how spatial overlap between stimuli
influences performance in a P300-based BCI. Significant differences in accuracy were found
between the 0% overlapped condition and all the other conditions, and between 33.3% and
higher overlap, namely 66.7% and 100%. Such results were explained by hypothesizing a
modulation in the non-target stimulus amplitude signal caused by the overlapping factor.
Figure 8. Example of an SSVEP: power spectrum (amplitude versus frequency in Hz) of the signal
observed when a subject gazes at a visual pattern flickering at 12 Hz, compared to the power
spectrum of the signal observed when the subject does not receive any stimulus (idle).
Visual flickering stimuli can be generated by LEDs [145] or computer monitors [146].
LEDs have the advantage of affording arbitrary flickering patterns, with the drawback that
the system requires special hardware; computer monitors are controlled entirely in software,
but the flickering frequency is strongly constrained by their refresh rate. It is important
to consider the safety and comfort of visual stimuli: modulated visual stimuli at certain
frequencies can provoke epileptic seizures [147], while bright flashes and repetitive
stimulation may impair the user’s vision and induce fatigue [148].
A BCI can utilize flickering visual stimuli that elicit SSVEPs to implement multiple
command inputs. Such an interface has multiple visual targets with different flickering
patterns (typically with different frequencies within the range 3–70 Hz [148–150]) for a
user to gaze at. By detecting the frequency of the recorded evoked potential, the inter-
face determines the user-intended command. An example of the arrangement of multiple
checkerboards on a monitor is illustrated in Figure 9, where six targets are displayed on-
screen. Each target makes a monochrome inversion in the checkerboard at different
frequencies, as illustrated in Figure 10. When the user gazes at the stimulus flickering at
and the reference Fourier series with the fundamental frequency f , denoted by
y(t) = [sin(2π · f t), cos(2π · f t), sin(2π · 2 f t), cos(2π · 2 f t), . . . , sin(2π · L f t), cos(2π · L f t)], (2)
where L denotes the number of harmonics. The canonical correlation between the observed
EEG and y(t) is calculated for all candidate frequencies, f = f 1 , . . . , f N , where N is the
number of visual targets. The frequency that gives the maximum canonical correlation is
recognized as the frequency of the target that the user is gazing at. This approach can be
improved by applying a bank of bandpass filters [157,158].
Recent studies have pointed out that the nonuniform spectrum of the spontaneous, or
background, EEG can deteriorate the performance of frequency recognition algorithms and
have proposed efficient methods that account for the effects of the background EEG [159–161].
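The CCA-based frequency recognition just described can be sketched as follows on a single synthetic channel (the candidate frequencies, noise level, and helper names are illustrative assumptions):

```python
import numpy as np

def max_canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def reference(f, t, L=2):
    """Fourier reference y(t) of Equation (2): L sine/cosine harmonic pairs."""
    return np.column_stack(
        [g(2 * np.pi * k * f * t) for k in range(1, L + 1) for g in (np.sin, np.cos)]
    )

fs = 256
t = np.arange(2 * fs) / fs                       # 2 s of single-channel data
rng = np.random.default_rng(1)
eeg = (np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size))[:, None]

candidates = [10.0, 12.0, 15.0]                  # flicker frequencies of the targets
scores = [max_canonical_corr(eeg, reference(f, t)) for f in candidates]
detected = candidates[int(np.argmax(scores))]    # frequency the user is gazing at
```

With multichannel EEG, `eeg` would simply have one column per electrode; the QR/SVD formulation of the canonical correlation is unchanged.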
In the CCA method used to identify the target frequency, the reference signal y(t) can be
replaced by calibration signals, i.e., EEG signals collected over multiple trials for each
target. There are several studies on data-driven methods for the CCA [80,162]. Recently, it
has been reported [163–165] that, by utilizing the correlation between trials with respect to
the same target, recognition accuracy can be improved. This method, called task-related
component analysis [166], can be applied to detect the target frequency as well as the
target phase.
Readers may wonder if this type of interface can be replaced by eye-tracking devices,
as the latter are much simpler to implement. Indeed, the answer to such a question is
mixed. For example, the comparative studies in [167,168] have suggested that, for small
targets on the screen, SSVEP-BCI achieves higher accuracy in command recognition than
eye-tracking-based interfaces.
SSVEP-based BCIs suffer from a few limitations. The power of the SSVEP response is
very weak when the flicker frequency is larger than 25 Hz [169]. Moreover, every SSVEP
elicited by a stimulus with a certain frequency includes secondary harmonics. As men-
tioned earlier, a computer monitor can generate visual stimuli with frequencies limited
by the refresh rate; hence, possible frequencies that could be used to form visual targets
are strictly limited [146,170]. In order to overcome the limitation of using targets flicker-
ing with different frequencies (called “frequency modulation” [171]), several alternative
stimuli have been proposed. Time-modulation uses the flash sequences of different tar-
gets that are mutually independent [172]. Code-modulation is another approach that uses
pseudo-random sequences, which are typically the “m-sequences” [81]. A kind of shift-
keying technique (typically used in digital communications) is also used to form flickering
targets [170]. Another paradigm is waveform-modulation [173], where various types of
periodic waveforms, such as rectangle, sinusoidal, and triangle waveforms, are used as
stimuli. In addition, it has been experimentally verified that different frequencies as well
as phase-shifts allocated to visual targets greatly increase the number of targets [161,174].
Surveys that illustrate several methods and paradigms for stimulating visually evoked
potentials are found in [148,175,176].
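As an illustration of the code-modulation approach, an m-sequence can be generated with a linear-feedback shift register (LFSR). The helper below is a generic textbook Fibonacci LFSR, not the stimulus generator of [81]:

```python
def m_sequence(nbits, taps):
    """Binary m-sequence from a Fibonacci LFSR. `taps` are 1-indexed from the
    output end; e.g. the primitive polynomial x^4 + x^3 + 1 gives taps (4, 3)."""
    state, out = 1, []
    for _ in range(2 ** nbits - 1):              # period of a maximal-length sequence
        out.append(state & 1)
        feedback = 0
        for tap in taps:
            feedback ^= (state >> (nbits - tap)) & 1
        state = (state >> 1) | (feedback << (nbits - 1))
    return out

seq = m_sequence(4, (4, 3))
# period 2**4 - 1 = 15, nearly balanced (8 ones, 7 zeros): suitable as a
# pseudo-random flash pattern for one visual target
```

Different targets would be assigned shifted or distinct m-sequences, whose low mutual correlation is what allows the decoder to tell them apart.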
Additionally, a type of SSVEP obtained by directing the user’s attention to the repetition of
a short-term stimulus—such as a visual flicker—can be utilized. The repetition of tactile [177]
and auditory [178,179] stimuli can likewise evoke such potentials.
ERD = (Pevent − Pref)/Pref, (3)
where the quantity Pevent denotes the power within the frequency band of interest in the
period after the event and the quantity Pref denotes the power within the frequency band
of interest in the preceding baseline (or reference) period.
Figure 11. Event-related desynchronisation (ERD [%] versus time in seconds) elicited by the motor
imagery tasks of moving the left hand (top) and the right hand (bottom) at electrodes CP3 and CP4.
The EEG signals were recorded while a subject performed motor imagery tasks with their
left and right hands [188]. After the recording, the EEG signals were bandpass-filtered with
a pass-band of 8–30 Hz. Figure 11 illustrates the EEG power averaged over 100 trials of the
left- or right-hand motor imagery task, normalized by a baseline energy computed from
signals measured before the motor imagery tasks were performed.
(Note that the ERD is defined as the relative power to the baseline power; therefore, if the
current power is larger than the base power, the ERD can be over 100%). The symbol “0” on
the horizontal axis is the time when the subject started performing the tasks. The decrease
in the energy while the subject performs the tasks can be clearly observed in Figure 11.
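A minimal sketch of the ERD computation of Equation (3), on a synthetic one-channel signal whose 10 Hz rhythm attenuates when the imagined movement starts (all amplitudes and timings are illustrative assumptions):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256
t = np.arange(4 * fs) / fs                  # 4 s: 2 s baseline, then 2 s of task
rng = np.random.default_rng(2)
mu = np.sin(2 * np.pi * 10 * t)             # 10 Hz sensorimotor rhythm
mu[t >= 2.0] *= 0.4                         # rhythm attenuates during motor imagery
eeg = mu + 0.2 * rng.standard_normal(t.size)

# band-pass 8-30 Hz, as in the recording described above
b, a = butter(4, [8 / (fs / 2), 30 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, eeg)

p_ref = np.mean(filtered[t < 2.0] ** 2)     # baseline (reference) power
p_event = np.mean(filtered[t >= 2.0] ** 2)  # power during the task
erd = (p_event - p_ref) / p_ref * 100       # Equation (3), in percent; negative here
```

A negative value indicates desynchronisation (power drop relative to baseline); in a real system the powers would be averaged over many trials, as in Figure 11.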
One of the merits of MI-BCI over BCIs based on the perception of flickering stimuli is that
no device is needed to display the stimuli. Moreover, it has been reported recently that the
detection of motor imagery tasks, combined with feedback, is useful for the rehabilitation
of patients who suffer from motor disorders caused by brain injuries [189–192]. The MI-BCI
can be utilized in rehabilitation to recover motor functions
as follows. One of the rehabilitation procedures for the recovery of motor functions is to
make a subject perform movements of a disabled body part by a cue and to give visual
or physical feedback [189]. The coincident events of the intention of the movement that
the subject has and the corresponding feedback to the subject are supposed to promote
plasticity of the brain.
In rehabilitation, it is considered important that the feedback be generated coincidentally
with the intention that elicited it. In the general procedure illustrated above, the cue
prompts the intention, which is detected from the EEG in order to give the feedback to the
patient. When using the motor-imagery BCI for rehabilitation, the feedback generation is
controlled by the interface: the MI-BCI enables the rehabilitation system to detect the
intention of a movement and to generate the feedback at an appropriate time. Some studies
have suggested that rehabilitation based on MI-BCIs
can promote the plasticity of the brain more efficiently than the conventional systems based
on the cue [189–192].
A further promising paradigm related to the ERD is passive movement (PM). Brain–
computer interfaces exploiting passive movements are instances of passive BCIs. The
passive movement is typically performed with a mechatronic finger rehabilitation de-
vice [193]. An early report about observing the effect in the brain cortical activity during
PM suggested that PMs consisting of brisk wrist extensions done with the help of a pulley
system resulted in significant ERD after the beginning of the movement, followed by ERS in
the beta band [194]. A recent study reported that the classification accuracy calculated from
EEG signals during passive and active hand movements did not differ significantly from
the classification accuracy for detecting MI. It has been reported that ERD is induced not
only by MI but also by either passive action observation (AO) or a combination of MI with
AO [195–197]. PM and AO cause less fatigue to users; therefore, they are very promising
for rehabilitation purposes.
A well-known method for extracting brain activity that is used in MI-BCIs is based
on the notion of common spatial pattern (CSP) [132,198,199]. The CSP consists of a set of
spatial weight coefficients corresponding to electrodes recording a multichannel EEG.
These coefficients are determined from a measured EEG in such a way that the variances of
the signal extracted by the spatial weights maximally differ between two tasks (e.g., left-
and right-hand movement imagery). Specifically, the CSP is given as follows: let C1 and
C2 be the spatial covariance matrices of a multichannel EEG recording during two tasks
(task 1 and task 2, respectively). The CSP for task 1 is given as the generalized eigenvector
w corresponding to the maximum generalized eigenvalue λ of the matrix-pencil (C1 , C2 ):
C1 w = λ C2 w. (4)
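Equation (4) can be solved directly with a generalized symmetric eigensolver. The two-channel covariances below are toy values chosen for illustration:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(3)
# two-channel toy EEG: task 1 has high variance on channel 1, task 2 on channel 2
x1 = rng.standard_normal((1000, 2)) * np.array([2.0, 0.5])
x2 = rng.standard_normal((1000, 2)) * np.array([0.5, 2.0])
C1, C2 = np.cov(x1.T), np.cov(x2.T)

# generalized symmetric eigenproblem C1 w = lambda C2 w of Equation (4);
# eigh returns eigenvalues in ascending order, so the CSP filter for task 1
# is the eigenvector associated with the largest eigenvalue
lam, W = eigh(C1, C2)
w = W[:, -1]
ratio = (w @ C1 @ w) / (w @ C2 @ w)   # variance ratio, maximized between tasks
```

The eigenvector of the smallest eigenvalue plays the symmetric role for task 2, and in practice several eigenvectors from both ends of the spectrum are kept as spatial filters.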
an unshielded environment. Over the last years, research efforts have been devoted to
BCI applications to smart environmental control systems, fast and smooth robotic arm
prototypes, as well as motion planning of autonomous or semi-autonomous vehicles. The
world of BCI applications is expanding, and new fields are opening in communications,
control, and automation, such as the control of unmanned vehicles [228], virtual reality
applications in games [227], environmental control, and improvements in the brain control
of robotic devices. To better understand the new capabilities of EEG-based BCIs, applications
related to control and automation systems are summarized in the next subsections.
Figure 12. A prototypical three-wheel, small-sized robot for smart-home applications used to
perform experiments (from the Department of Information Engineering at the Università Politecnica
delle Marche).
In the studies of [234,235], the authors proposed the integration of the BCI technique
with universal plug and play (UPnP) home networking for smart house applications.
The proposed system can process EEG signals without transmitting them to back-end
personal computers. Such flexibility, together with the low power consumption, the
small-volume wireless physiological signal acquisition modules, and the embedded signal
processing modules, makes this technology suitable for various kinds of smart applications
in daily life.
The study of [236] evaluated the performances of an EEG-based BCI system to control
smart home applications with high accuracy and high reliability. In said study, a P300-based
BCI system was connected to a virtual reality system that can be easily reconfigurable and
therefore constitutes a favorable testing environment for real smart homes for disabled
people. The authors of [237] proposed an implementation of a BCI system for controlling
wheelchairs and electric appliances in a smart house to assist the daily-life activities of its
users. Tests performed by a subject achieved satisfactory results.
Virtual reality concerns human–computer interaction, where the signals extracted
from the brain are used to interact with a computer. With advances in the interaction
with computers, new applications have appeared: video games [227] and virtual reality
developed with noninvasive techniques [238,239].
to support people with motor disabilities to improve their quality of life, to enhance
the residual abilities, or to replace lost functionality [78]. For example, with regard to
individuals affected by neurological disabilities, the operation of an external robotic arm to
facilitate handling activities could take advantage of these new communication modalities
between humans and physical devices [22]. Some functions, such as those connected with
the ability to select items on a screen by moving a cursor in a three-dimensional scene, are
straightforward using BCI-based control [77,241]. However, a more sophisticated control
strategy is required to accomplish control tasks at more complex levels because most
external effectors (mechanical prosthetics, motor robots, and wheelchairs) possess more
degrees of freedom. Moreover, a major requirement of brain-controlled mobile robotic
systems is higher safety, since these mobile robots are used to transport disabled people [78].
In BCI-based control, EEG signals are translated into user intentions.
In synchronous protocols, P300- and SSVEP-based BCIs relying on external stimulation
are usually adopted. Asynchronous protocols rely on event-related desynchronization and
ERS, i.e., on interfaces independent of external stimuli. In fact, since asynchronous BCIs do not
require any external stimulus, they appear more suitable and natural for brain-controlled
mobile robots, where users need to focus their attention on robot driving but not on
external stimuli.
Another aspect is related to two different operational modes that can be adopted in
brain-controlled mobile robots [78]. One category is called “direct control by the BCI”,
which means that the BCI translates EEG signals into motion commands to control robots
directly. This method is computationally less complex and does not require additional
intelligence. However, the overall performance of these brain-controlled mobile robots
mainly depends on the performance of noninvasive BCIs, which are currently slow and
uncertain [78]. In other words, the performance of the BCI systems limits that of the robots.
In the second category of brain-controlled robots, a shared control was developed, where a
user (using a BCI) and an intelligent controller (such as an autonomous navigation system)
share the control over the robots. In this case, the performance of the robots depends on
their intelligence. Thus, the safety of driving these robots can be better ensured, and even
the accuracy of the inference of the users’ intentions can be improved. This kind of approach
is less demanding for the users, but their reduced effort translates into a higher computational
cost. The use of sensors (such as laser sensors) is often required.
In fact, the system switches between the fast and the slow BCIs depending on the state
of the wheelchair. The paper [255] describes a brain-actuated wheelchair based on a
synchronous P300 neurophysiological protocol integrated in a real-time graphical scenario
builder, which incorporates advanced autonomous navigation capabilities (shared control).
In the experiments, the task of the autonomous navigation system was to drive the vehicle
to a given destination while also avoiding obstacles (both static and dynamic) detected
by the laser sensor. The goal/location was provided by the user by means of a brain–
computer interface.
The contributions of [256,257] describe a BCI based on SSVEPs to control the move-
ment of an autonomous robotic wheelchair. The signals used in this work come from
individuals who are visually stimulated. The stimuli are black-and-white checkerboards
flickering at different frequencies.
Asynchronous protocols have been suggested for the BCI-based wheelchair control
in [258–260]. The authors of [258] used beta oscillations in the EEG elicited by imagination
of movements of a paralysed subject for a self-paced asynchronous BCI control. The subject,
immersed in a virtual street populated with avatars, was asked to move among the avatars
toward the end of the street, to stop by each avatar, and to talk to them. In the experiments
described in [259], a human user performs path planning and fully controls a wheelchair,
except for automatic obstacle avoidance based on a laser range finder. In the experiments
reported in [260], two human subjects were asked to mentally drive both a real and a
simulated wheelchair from a starting point to a goal along a prespecified path.
Several recent papers describe BCI applications where wheelchair control is multi-
dimensional. In fact, it appears that control commands from a single modality were not
enough to meet the criteria of multi-dimensional control. The combination of different
EEG signals can be adopted to give multiple control (simultaneous or sequential) com-
mands. The authors of [261,262] showed that hybrid EEG signals, such as SSVEP and
motor imagery, could improve the classification accuracy of brain–computer interfaces.
The authors of [263,264] adopted the combination of P300 potential and MI or SSVEP to
control a brain-actuated wheelchair. In this case, multi-dimensional control (direction and
speed) is provided by multiple commands. In [265], the authors proposed a hybrid BCI
system that combines MI and SSVEP to control the speed and direction of a wheelchair
synchronously. In this system, the direction of the wheelchair was given by left- and
right-hand imagery. The idle state, without mental activities, was decoded to keep the
wheelchair moving in a straight line. Synchronously, SSVEP signals induced by gazing at
specific flashing buttons were used to accelerate or decelerate the wheelchair. To
make it easier for the reader to identify the references in this section, Table 2 summarizes
the papers about BCI applications presented in each subsection.
Table 2. Summary of the reference papers on BCI applications, each marked (X) in one of four
application areas: unmanned vehicles and robotics; mobile robotics and interaction with robotic
arms; robotic arms, robotic tele-presence, and electrical prostheses; wheelchair control and
autonomous vehicles.
[115] X
[216] X
[217] X
[218] X
[219] X
[220] X
[221] X
[222] X
[223] X
[224] X
[225] X
[226] X
[227] X
[78] X
[34] X
[228] X
[229] X
[230] X
[231] X
[232] X
[132] X
[240] X
[78] X
[22] X
[77] X
[241] X
[123] X
[242] X
[240] X
[243] X
[244] X
[245] X
[246] X
[247] X
[248] X
[249] X
[1] X
[250] X
[251] X
[252] X
[253] X
[255] X
[256] X
[257] X
[258] X
[259] X
[261] X
[262] X
[263] X
[264] X
[265] X
6. Conclusions
Because of its nature, a BCI is conceived for continuous interaction between the brain
and controlled devices, affording external activity and the control of apparatuses. The interface
enables a direct communication pathway between the brain and the object to be controlled.
By reading neuronal oscillations from an array of neurons and by using computer chips
and programs to translate the signals into actions, a BCI can enable a person suffering
from paralysis to write a book or to control a motorized wheelchair. Current BCIs require
deliberate conscious thought, while future applications, such as prosthetic control, are
likely to work effortlessly. One of the major challenges in developing BCI technologies
has been the design of electrodes and surgical methods that are minimally invasive. In
the traditional BCI model, the brain accepts an implanted mechanical device and controls
the device as a natural part of its representation of the body. Much current research is
focused on the potential of noninvasive BCI. Cognitive-computation-based systems use
adaptive algorithms and pattern-matching techniques to facilitate communication. Both
the user and software are expected to adapt and learn, making the process more efficient
with practice.
Near-term applications of BCIs are primarily task-oriented and are targeted to avoid
the most difficult obstacles in development. In the farther term, brain–computer interfaces
will enable a broad range of task-oriented and opportunistic applications by leveraging per-
vasive technologies to sense and merge critical brain, behavioral, task, and environmental
information [268].
The theoretical groundwork of the 1930s and 1940s and the technical advances in computers
in the following decades provided the basis for dramatic increases in human efficiency.
While computers continue to evolve, the interface between humans and computers has
begun to present a serious impediment to the full realization of the potential payoff [241].
While machine learning approaches have led to tremendous advances in BCIs in recent
years, there still exists a large variation in performance across subjects [269]. Understand-
ing the reasons for this variation in performance constitutes perhaps one of the most
fundamental open questions in the research on BCI. Future research on the integration of
cognitive computation and brain–computer interfacing is foreseen to be about how the
direct communication between the brain and the computer can be used to overcome this
impediment by improving or augmenting conventional forms of human communication.
Author Contributions: conceptualization, S.F., A.B., and T.T.; data curation, H.H. and T.T.; writing—
original draft preparation, A.B., S.F., and F.V.; writing—review and editing, S.F., A.B., T.T., H.H.,
and F.V.; supervision, S.F. and T.T. All authors have read and agreed to the published version of
the manuscript.
Funding: This research received no external funding.
Acknowledgments: The authors wish to thank the anonymous reviewers for their valuable com-
ments and suggestions as well as the editors of this Special Issue for the invitation to contribute.
Conflicts of Interest: The authors declare no conflict of interest.
Abbreviations
The following abbreviations are used in this manuscript:
Acronym Meaning
ALS Amyotrophic lateral sclerosis
AO Action observation
BBI Brain-to-brain interface
BCI Brain–computer interface
BMI Brain–machine interface
BSS Blind source separation
CCA Canonical correlation analysis
CSP Common spatial pattern
DBS Deep brain stimulation
ECoG Electrocorticogram
EEG Electroencephalogram
ELM Extreme learning machine
EOG Electrooculogram
EP Evoked potential
ER Evoked response
ERD Event-related desynchronisation
ERP Event-related potential
References
1. Pfurtscheller, G.; Neuper, C.; Birbaumer, N. Human Brain-Computer Interface; CRC Press: Boca Raton, FL, USA, 2005; pp. 1–35.
2. Wolpaw, J.R.; Birbaumer, N.; McFarland, D.J.; Pfurtscheller, G.; Vaughan, T.M. Brain-computer interfaces for communication and
control. Clin. Neurophysiol. 2002, 113, 767–791. doi:10.1016/S1388-2457(02)00057-3.
3. Nicolelis, M. Beyond Boundaries; St. Martin’s Press: New York, NY, USA, 2012.
4. Draguhn, A.; Buzsáki, G. Neuronal oscillations in cortical networks. Science 2004, 304, 1926–1929.
5. Thut, G.; Schyns, P.G.; Gross, J. Entrainment of perceptually relevant brain oscillations by non-invasive rhythmic stimulation of
the human brain. Front. Psychol. 2011, 2. doi:10.3389/fpsyg.2011.00170.
6. Cox, R.; van Driel, J.; de Boer, M.; Talamini, L.M. Slow Oscillations during Sleep Coordinate Interregional Communication in
Cortical Networks. J. Neurosci. 2014, 34, 16890–16901. doi:10.1523/JNEUROSCI.1953-14.2014.
7. Fontolan, L.; Krupa, M.; Hyafil, A.; Gutkin, B. Analytical insights on Theta-Gamma coupled neural oscillators. J. Math. Neurosci.
2013, 3. doi:10.1186/2190-8567-3-16.
8. Ritter, P.; Born, J.; Brecht, M.; Dinse, H.R.; Heinemann, U.; Pleger, B.; Schmitz, D.; Schreiber, S.; Villringer, A.; Kempter, R.
State-dependencies of learning across brain scales. Front. Comput. Neurosci. 2015, 9. doi:10.3389/fncom.2015.00001.
9. Santhanam, G.; Ryu, S.I.; Yu, B.M.; Afshar, A.; Shenoy, K.V. A high-performance brain-computer interface. Nature 2006,
442, 195–198. doi:10.1038/nature04968.
10. Jackson, M.M.; Mappus, R. Applications for brain-computer interfaces. In Brain-Computer Interfaces; Springer: London, UK, 2010;
pp. 89–103. doi:10.1007/978-1-84996-272-8_6.
11. Niedermeyer, E.; Da Silva, F. Electroencephalography: Basic Principles, Clinical Applications, and Related Fields; Lippincott Williams &
Wilkins: Philadelphia, PA, USA, 2005.
12. Sutter, E. The Brain Response Interface: Communication Through Visually-induced Electrical Brain Responses. J. Microcomput.
Appl. 1992, 15, 31–45. doi:10.1016/0745-7138(92)90045-7.
13. Becedas, J. Brain Machine Interfaces: Basis and Advances. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2012, 42, 825–836.
14. Erp, J.V.; Lotte, F.; Tangermann, M. Brain-computer interfaces: Beyond medical applications. Computer 2012, 45, 26–34.
15. LaRocco, J.; Paeng, D.G. Optimizing Computer–Brain Interface Parameters for Non-invasive Brain-to-Brain Interface. Front.
Neuroinform. 2020, 14. doi:10.3389/fninf.2020.00001.
16. Bos, D.P.; Reuderink, B.; van de Laar, B.; Gürkök, H.; Mühl, C.; Poel, M.; Nijholt, A.; Heylen, D. Brain-Computer Interfacing and
Games. In Brain-Computer Interfaces—Applying our Minds to Human-Computer Interaction; Tan, D.S., Nijholt, A., Eds.; Human-
Computer Interaction Series; Springer: London, UK, 2010; pp. 149–178. doi:10.1007/978-1-84996-272-8_10.
17. Nijholt, A. BCI for games: A ‘state of the art’ survey. In Entertainment Computing—ICEC 2008, 7th International Conference,
Pittsburgh, PA, USA, 25–27 September 2008; Springer: Berlin/Heidelberg, Germany, 2008; pp. 225–228.
18. Aricò, P.; Borghini, G.; Di Flumeri, G.; Sciaraffa, N.; Babiloni, F. Passive BCI beyond the LAB: Current Trends and Future
Directions. Physiol. Meas. 2018, 39. doi:10.1088/1361-6579/aad57e.
19. Xu, B.; Li, W.; He, X.; Wei, Z.; Zhang, D.; Wu, C.; Song, A. Motor Imagery Based Continuous Teleoperation Robot Control with
Tactile Feedback. Electronics 2020, 9. doi:10.3390/electronics9010174.
20. Korovesis, N.; Kandris, D.; Koulouras, G.; Alexandridis, A. Robot Motion Control via an EEG-Based Brain–Computer Interface
by Using Neural Networks and Alpha Brainwaves. Electronics 2019, 8, 1387. doi:10.3390/electronics8121387.
21. Blankertz, B.; Acqualagna, L.; Dähne, S.; Haufe, S.; Kraft, M.; Sturm, I.; Ušćumlic, M.; Wenzel, M.; Curio, G.; Müller, K. The Berlin
Brain-Computer Interface: Progress Beyond Communication and Control. Front. Neurosci. 2016, 10. doi:10.3389/fnins.2016.00530.
Electronics 2021, 10, 560 34 of 43
22. Bamdad, M.; Zarshenas, H.; Auais, M.A. Application of BCI systems in neurorehabilitation: A scoping review. Disabil. Rehabil.
Assist. Technol. 2014, 10, 355–364. doi:10.3109/17483107.2014.961569.
23. Vansteensel, M.; Jarosiewicz, B. Brain-computer interfaces for communication. Handb. Clin. Neurol. 2020, 168, 65–85.
doi:10.1016/B978-0-444-63934-9.00007-X.
24. Aricò, P.; Sciaraffa, N.; Babiloni, F. Brain–Computer Interfaces: Toward a Daily Life Employment. Brain Sci. 2020, 10, 157.
doi:10.3390/brainsci10030157.
25. Di Flumeri, G.; De Crescenzio, F.; Berberian, B.; Ohneiser, O.; Kramer, J.; Aricò, P.; Borghini, G.; Babiloni, F.; Bagassi, S.; Piastra, S.
Brain–Computer Interface-Based Adaptive Automation to Prevent Out-Of-The-Loop Phenomenon in Air Traffic Controllers
Dealing With Highly Automated Systems. Front. Hum. Neurosci. 2019, 13. doi:10.3389/fnhum.2019.00296.
26. Borghini, G.; Aricò, P.; Di Flumeri, G.; Sciaraffa, N.; Colosimo, A.; Herrero, M.T.; Bezerianos, A.; Thakor, N.V.; Babiloni, F. A new
perspective for the training assessment: Machine learning-based neurometric for augmented user’s evaluation. Front. Hum.
Neurosci. 2017, 11. doi:10.3389/fnhum.2017.00325.
27. Aricò, P.; Reynal, M.; Di Flumeri, G.; Borghini, G.; Sciaraffa, N.; Imbert, J.P.; Hurter, C.; Terenzi, M.; Ferreira, A.; Pozzi, S.; et al.
How neurophysiological measures can be used to enhance the evaluation of remote tower solutions. Front. Hum. Neurosci. 2019,
13. doi:10.3389/fnhum.2019.00303.
28. Schettini, F.; Aloise, F.; Aricò, P.; Salinari, S.; Mattia, D.; Cincotti, F. Self-calibration algorithm in an asynchronous P300-based
brain–computer interface. J. Neural Eng. 2014, 11, 035004. doi:10.1088/1741-2560/11/3/035004.
29. Rezeika, A.; Benda, M.; Stawicki, P.; Gembler, F.; Saboor, A.; Volosyak, I. Brain–Computer Interface Spellers: A review. Brain Sci.
2018, 8, 57. doi:10.3390/brainsci8040057.
30. Diya, S.Z.; Prorna, R.A.; Rahman, I.; Islam, A.; Islam, M. Applying brain-computer interface technology for evaluation of user
experience in playing games. In Proceedings of the 2019 International Conference on Electrical, Computer and Communication
Engineering (ECCE), Cox's Bazar, Bangladesh, 7–9 February 2019.
31. Rosca, S.D.; Leba, M. Design of a brain-controlled video game based on a BCI system. MATEC Web Conf. 2019, 290, 01019.
doi:10.1051/matecconf/201929001019.
32. Abbasi-Asl, R.; Keshavarzi, M.; Chan, D. Brain-computer interface in virtual reality. In Proceedings of the 9th International
IEEE/EMBS Conference on Neural Engineering (NER), San Francisco, CA, USA, 20–23 March 2019; pp. 1220–1224.
33. Zhong, S.; Liu, Y.; Yu, Y.; Tang, J.; Zhou, Z.; Hu, D. A dynamic user interface based BCI environmental control system. Int. J. Hum.
Comput. Interact. 2020, 36, 55–66. doi:10.1080/10447318.2019.1604473.
34. Kim, M.; Kim, M.K.; Hwang, M.; Kim, H.Y.; Cho, J.; Kim, S.P. Online Home Appliance Control Using EEG-Based Brain–Computer
Interfaces. Electronics 2019, 8, 1101. doi:10.3390/electronics8101101.
35. Yu, Y.; Garrison, H.; Battison, A.; Gabel, L. Control of a Quadcopter with Hybrid Brain-Computer Interface. In Proceedings of the
2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), Bari, Italy, 6–9 October 2019; pp. 2779–2784.
36. Nicolas-Alonso, L.F.; Gomez-Gil, J. Brain computer interfaces, a review. Sensors 2012, 12, 1211–1279. doi:10.3390/s120201211.
37. Hochberg, L.; Donoghue, J. Sensors for brain-computer interfaces. IEEE Eng. Med. Biol. Mag. 2006, 25, 32–38.
doi:10.1109/MEMB.2006.1705745.
38. Birbaumer, N.; Murguialday, A.R.; Cohen, L. Brain-computer interface in paralysis. Curr. Opin. Neurol. 2008, 21, 634–638.
doi:10.1097/WCO.0b013e328315ee2d.
39. Sitaram, R.; Weiskopf, N.; Caria, A.; Veit, R.; Erb, M.; Birbaumer, N. fMRI brain-computer interfaces. IEEE Signal Process. Mag.
2008, 25, 95–106. doi:10.1109/MSP.2008.4408446.
40. Leuthardt, E.C.; Miller, K.J.; Schalk, G.; Rao, R.P.N.; Ojemann, J.G. Electrocorticography-based brain computer interface—The
Seattle experience. IEEE Trans. Neural Syst. Rehabil. Eng. 2006, 14, 194–198. doi:10.1109/TNSRE.2006.875536.
41. Singh, S.P. Magnetoencephalography: Basic principles. Ann. Indian Acad. Neurol. 2014, 17 (Suppl. 1), S107–S112.
doi:10.4103/0972-2327.128676.
42. Mandal, P.K.; Banerjee, A.; Tripathi, M.; Sharma, A. A Comprehensive Review of Magnetoencephalography (MEG)
Studies for Brain Functionality in Healthy Aging and Alzheimer’s Disease (AD). Front. Comput. Neurosci. 2018, 12, 60.
doi:10.3389/fncom.2018.00060.
43. Keppler, J.S.; Conti, P.S. A Cost Analysis of Positron Emission Tomography. Am. J. Roentgenol. 2001, 177, 31–40.
44. Vaquero, J.J.; Kinahan, P. Positron Emission Tomography: Current Challenges and Opportunities for Technological Advances in
Clinical and Preclinical Imaging Systems. Annu. Rev. Biomed. Eng. 2015, 17, 385–414. doi:10.1146/annurev-bioeng-071114-040723.
45. Queiroz, M.A.; de Galiza Barbosa, F.; Buchpiguel, C.A.; Cerri, G.G. Positron emission tomography/magnetic resonance imaging
(PET/MRI): An update and initial experience at HC-FMUSP. Rev. Assoc. Med. Bras. 2018, 64, 71–84. doi:10.1590/1806-9282.64.01.71.
46. Lu, W.; Dong, K.; Cui, D.; Jiao, Q.; Qiu, J. Quality assurance of human functional magnetic resonance imaging: A literature
review. Quant. Imaging Med. Surg. 2019, 9, 1147.
47. Bielczyk, N.Z.; Uithol, S.; van Mourik, T.; Anderson, P.; Glennon, J.C.; Buitelaar, J.K. Disentangling causal webs in the brain using
functional magnetic resonance imaging: A review of current approaches. Netw. Neurosci. 2019, 3, 237–273.
48. Silva, A.; Merkle, H. Hardware considerations for functional magnetic resonance imaging. Concepts Magn. Reson. Educ. J. 2003,
16, 35–49.
49. Shokoueinejad, M.; Park, D.W.; Jung, Y.; Brodnick, S.; Novello, J.; Dingle, A.; Swanson, K.; Baek, D.H.; Suminski, A.; Lake, W.; et al.
Progress in the field of micro-electrocorticography. Micromachines 2019, 10, 62. doi:10.3390/mi10010062.
50. Kumar Mudgal, S.; Sharma, S.; Chaturvedi, J.; Sharma, A. Brain computer interface advancement in neurosciences: Applications
and issues. Interdiscip. Neurosurg. 2020, 20, 100694. doi:10.1016/j.inat.2020.100694.
51. Spychala, N.; Debener, S.; Bongartz, E.; Müller, H.; Thorne, J.; Philipsen, A.; Braun, N. Exploring Self-Paced Embodiable
Neurofeedback for Post-stroke Motor Rehabilitation. Front. Hum. Neurosci. 2020, 13. doi:10.3389/fnhum.2019.00461.
52. Pascual-Leone, A.; Amedi, A.; Fregni, F.; Merabet, L.B. The plastic human brain cortex. Annu. Rev. Neurosci. 2005, 28, 377–401.
53. Friehs, G.M.; Zerris, V.A.; Ojakangas, C.L.; Fellows, M.R.; Donoghue, J.P. Brain-machine and brain-computer interfaces. Stroke
2004, 35, 2702–2705. doi:10.1161/01.STR.0000143235.93497.03.
54. Lee, S.; Fallegger, F.; Casse, B.; Fried, S. Implantable microcoils for intracortical magnetic stimulation. Sci. Adv. 2016, 2.
doi:10.1126/sciadv.1600889.
55. Koch Fager, S.; Fried-Oken, M.; Jakobs, T.; Beukelman, D. New and emerging access technologies for adults with complex
communication needs and severe motor impairments: State of the science. Augment. Altern. Commun. 2019, 35, 13–25.
doi:10.1080/07434618.2018.1556730.
56. Bočková, M.; Rektor, I. Impairment of brain functions in Parkinson’s disease reflected by alterations in neural connectivity in
EEG studies: A viewpoint. Clin. Neurophysiol. 2019, 130, 239–247. doi:10.1016/j.clinph.2018.11.013.
57. Borhani, S.; Abiri, R.; Jiang, Y.; Berger, T.; Zhao, X. Brain connectivity evaluation during selective attention using EEG-based
brain-computer interface. Brain-Comput. Interfaces 2019, 6, 25–35. doi:10.1080/2326263X.2019.1651186.
58. Borton, D.A.; Yin, M.; Aceros, J.; Nurmikko, A. An implantable wireless neural interface for recording cortical circuit dynamics in
moving primates. J. Neural Eng. 2013, 10, 026010. doi:10.1088/1741-2560/10/2/026010.
59. Dethier, J.; Nuyujukian, P.; Ryu, S.; Shenoy, K.; Boahen, K. Design and validation of a real-time spiking-neural-network decoder
for brain-machine interfaces. J. Neural Eng. 2013, 10, 036008. doi:10.1088/1741-2560/10/3/036008.
60. Benabid, A.; Costecalde, T.; Eliseyev, A.; Charvet, G.; Verney, A.; Karakas, S.; Foerster, M.; Lambert, A.; Morinière, B.;
Abroug, N.; et al. An exoskeleton controlled by an epidural wireless brain–machine interface in a tetraplegic patient: A
proof-of-concept demonstration. Lancet Neurol. 2019, 18, 1112–1122.
61. Al Ajrawi, S.; Al-Hussaibi, W.; Rao, R.; Sarkar, M. Efficient balance technique for brain-computer interface applications based on
I/Q down converter and time interleaved ADCs. Inform. Med. Unlocked 2020, 18, 100276.
62. Wahlstrom, K.; Fairweather, B.; Ashman, H. Privacy and brain-computer interfaces: method and interim findings. ORBIT J. 2017,
1, 1–19. doi:10.29297/orbit.v1i2.39.
63. Daly, J.J.; Wolpaw, J.R. Brain-computer interfaces in neurological rehabilitation. Lancet Neurol. 2008, 7, 1032–1043.
doi:10.1016/S1474-4422(08)70223-0.
64. Thomas, K.P.; Guan, C.; Lau, C.T.; Vinod, A.P.; Ang, K.K. A new discriminative common spatial pattern method for motor
imagery brain-computer interfaces. IEEE Trans. Biomed. Eng. 2009, 56, 2730–2733. doi:10.1109/TBME.2009.2026181.
65. Lenhardt, A.; Kaper, M.; Ritter, H.J. An adaptive P300-based online brain-computer interface. IEEE Trans. Neural Syst. Rehabil.
Eng. 2008, 16, 121–130. doi:10.1109/TNSRE.2007.912816.
66. Xia, B.; Li, X.; Xie, H.; Yang, W.; Li, J.; He, L. Asynchronous brain-computer interface based on steady-state visual-evoked
potential. Cogn. Comput. 2013, 5, 243–251. doi:10.1007/s12559-013-9202-7.
67. Pfurtscheller, G.; Allison, B.Z.; Brunner, C.; Bauernfeind, G.; Solis-Escalante, T.; Scherer, R.; Zander, T.O.; Müller-Putz, G.R.;
Neuper, C.; Birbaumer, N. The Hybrid BCI. Front. Neurosci. 2010, 4, 3. doi:10.3389/fnpro.2010.00003.
68. Kwon, M.; Cho, H.; Won, K.; Ahn, M.; Jun, S. Use of both eyes-open and eyes-closed resting states may yield a more robust
predictor of motor imagery BCI performance. Electronics 2020, 9, 690.
69. Ahn, M.; Cho, H.; Ahn, S.; Jun, S.C. High theta and low alpha powers may be indicative of BCI-illiteracy in motor imagery.
PLoS ONE 2013, 8, e80886. doi:10.1371/journal.pone.0080886.
70. Kaiser, V.; Kreilinger, A.; Müller-Putz, G.R.; Neuper, C. First steps toward a motor imagery based stroke BCI: New strategy to set
up a classifier. Front. Neurosci. 2011, 5, 86. doi:10.3389/fnins.2011.00086.
71. Riccio, A.; Simione, L.; Schettini, F.; Pizzimenti, A.; Inghilleri, M.; Belardinelli, M.O.; Mattia, D.; Cincotti, F. Attention
and P300-based BCI performance in people with amyotrophic lateral sclerosis. Front. Hum. Neurosci. 2013, 7, 732.
doi:10.3389/fnhum.2013.00732.
72. Brouwer, A.M.; van Erp, J.B.F. A tactile P300 brain-computer interface. Front. Neurosci. 2010, 4, 19. doi:10.3389/fnins.2010.00019.
73. Cai, Z.; Makino, S.; Rutkowski, T.M. Brain evoked potential latencies optimization for spatial auditory brain-computer interface.
Cogn. Comput. 2015, 7, 34–43. doi:10.1007/s12559-013-9228-x.
74. Martišius, I.; Damasevicius, R. A prototype SSVEP based real time BCI gaming system. Comput. Intell. Neurosci. 2016,
2016, 3861425. doi:10.1155/2016/3861425.
75. Pfurtscheller, G.; Neuper, C. Motor Imagery and Direct Brain-Computer Communication. Proc. IEEE 2001, 89, 1123–1134.
doi:10.1109/5.939829.
76. Aloise, F.; Aricò, P.; Schettini, F.; Salinari, S.; Mattia, D.; Cincotti, F. Asynchronous gaze-independent event-related potential-based
brain-computer interface. Artif. Intell. Med. 2013, 59, 61–69. doi:10.1016/j.artmed.2013.07.006.
77. Wolpaw, J.R.; Birbaumer, N.; Heetderks, W.J.; McFarland, D.J.; Peckham, H.P.; Gerwin, S.; Emanuel, D.; Quatrano, L.A.; Robinson,
C.J.; Vaughan, T.M. Brain-Computer Interface Technology: A Review of the First International Meeting. IEEE Trans. Rehabil. Eng.
2000, 8, 222–226.
78. Bi, L.; Fan, X.A.; Liu, Y. EEG-Based Brain-Controlled Mobile Robots: A Survey. IEEE Trans. Hum. Mach. Syst. 2013,
43, 161–176. doi:10.1109/TSMCC.2012.2219046.
79. Han, C.H.; Müller, K.R.; Hwang, H.J. Brain-switches for asynchronous brain–computer interfaces: A systematic review. Electronics
2020, 9, 422. doi:10.3390/electronics9030422.
80. Suefusa, K.; Tanaka, T. Asynchronous brain-computer interfacing based on mixed-coded visual stimuli. IEEE Trans. Biomed. Eng.
2018, 65, 2119–2129. doi:10.1109/TBME.2017.2785412.
81. Bin, G.; Gao, X.; Wang, Y.; Li, Y.; Hong, B.; Gao, S. A high-speed BCI based on code modulation VEP. J. Neural Eng. 2011, 8, 025015.
doi:10.1088/1741-2560/8/2/025015.
82. Huan, N.J.; Palaniappan, R. Neural network classification of autoregressive features from electroencephalogram signals for
brain-computer interface design. J. Neural Eng. 2004, 1, 142–150. doi:10.1088/1741-2560/1/3/003.
83. Butkevičiūtė, E.; Bikulciene, L.; Sidekerskiene, T.; Blazauskas, T.; Maskeliunas, R.; Damasevicius, R.; Wei, W. Removal of
movement artefact for mobile EEG analysis in sports exercises. IEEE Access 2019, 7, 7206–7217. doi:10.1109/ACCESS.2018.2890335.
84. Di Flumeri, G.; Aricò, P.; Borghini, G.; Colosimo, A.; Babiloni, F. A new regression-based method for the eye blinks artifacts
correction in the EEG signal, without using any EOG channel. In Proceedings of the 2016 38th Annual International Conference of
the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 3187–3190.
doi:10.1109/EMBC.2016.7591406.
85. Jirayucharoensak, S.; Israsena, P. Automatic removal of EEG artifacts using ICA and Lifting Wavelet Transform. In Proceedings
of the 2013 International Computer Science and Engineering Conference (ICSEC), Nakhonpathom, Thailand, 4–6 September 2013;
pp. 136–139. doi:10.1109/ICSEC.2013.6694767.
86. Zhang, X.; Vialatte, F.B.; Chen, C.; Rathi, A.; Dreyfus, G. Embedded implementation of second-order blind identification (SOBI)
for real-time applications in neuroscience. Cogn. Comput. 2014, 7, 56–63. doi:10.1007/s12559-014-9282-z.
87. Argunşah, A.; Çetin, M. A brain-computer interface algorithm based on Hidden Markov models and dimensionality reduction.
In Proceedings of the 2010 IEEE 18th Signal Processing and Communications Applications Conference, Diyarbakir, Turkey,
22–24 April 2010; pp. 93–96.
88. Li, J.; Chen, X.; Li, Z. Spike detection and spike sorting with a hidden Markov model improves offline decoding of motor cortical
recordings. J. Neural Eng. 2019, 16. doi:10.1088/1741-2552/aaeaae.
89. Duan, L.; Zhong, H.; Miao, J.; Yang, Z.; Ma, W.; Zhang, X. A voting optimized strategy based on ELM for improving classification
of motor imagery BCI data. Cogn. Comput. 2014, 6, 477–483. doi:10.1007/s12559-014-9264-1.
90. Banitalebi, A.; Setarehdan, S.K.; Hossein-Zadeh, G.A. A technique based on chaos for brain computer interfacing.
In Proceedings of the 14th International CSI Computer Conference, Tehran, Iran, 20–21 October 2009; pp. 464–469.
doi:10.1109/CSICC.2009.5349623.
91. Thulasidas, M.; Guan, C.; Wu, J. Robust classification of EEG signal for brain-computer interface. IEEE Trans. Neural Syst.
Rehabil. Eng. 2006, 14, 24–29. doi:10.1109/TNSRE.2005.862695.
92. van Gerven, M.; Farquhar, J.; Schaefer, R.; Vlek, R.; Geuze, J.; Nijholt, A.; Ramsey, N.; Haselager, P.; Vuurpijl, L.; Gielen, S.; et al.
The brain-computer interface cycle. J. Neural Eng. 2009, 6, 041001. doi:10.1088/1741-2560/6/4/041001.
93. Ryu, S.; Shenoy, K. Human cortical prostheses: Lost in translation? Neurosurg. Focus 2009, 27, E5.
94. Bloch, E.; Luo, Y.; da Cruz, L. Advances in retinal prosthesis systems. Ther. Adv. Ophthalmol. 2019, 11. doi:10.1177/2515841418817501.
95. Zeng, F.G. Trends in cochlear implants. Trends Amplif. 2004, 8, 1–34. doi:10.1177/108471380400800102.
96. Rothschild, R.M. Neuroengineering tools/applications for bidirectional interfaces, brain–computer interfaces, and neuroprosthetic
implants—A review of recent progress. Front. Neuroeng. 2010, 3, 112. doi:10.3389/fneng.2010.00112.
97. Krishna, N.; Sekaran, K.; Annepu, V.; Vamsi, N.; Ghantasala, G.; Pethakamsetty, C.; Kadry, S.; Blazauskas, T.; Damasevicius, R.
An Efficient Mixture Model Approach in Brain-Machine Interface Systems for Extracting the Psychological Status of Mentally
Impaired Persons Using EEG Signals. IEEE Access 2019, 7, 77905–77914. doi:10.1109/ACCESS.2019.2922047.
98. Damasevicius, R.; Maskeliunas, R.; Kazanavicius, E.; Woźniak, M. Combining Cryptography with EEG Biometrics. Comput. Intell.
Neurosci. 2018, 2018, 1867548. doi:10.1155/2018/1867548.
99. Al-Saggaf, A. Secure method for combining cryptography with iris biometrics. J. Univers. Comput. Sci. 2018, 24, 341–356.
100. Zhang, C.; Kimura, Y.; Higashi, H.; Tanaka, T. A simple platform of brain-controlled mobile robot and its implementation
by SSVEP. In Proceedings of the 2012 International Joint Conference on Neural Networks (IJCNN), Brisbane, QLD, Australia,
10–15 June 2012; pp. 1–7.
101. Edelman, B.; Meng, J.; Suma, D.; Zurn, C.; Nagarajan, E.; Baxter, B.; Cline, C.; He, B. Noninvasive neuroimaging enhances
continuous neural tracking for robotic device control. Sci. Robot. 2019, 4. doi:10.1126/scirobotics.aaw6844.
102. Bockbrader, M. Upper limb sensorimotor restoration through brain–computer interface technology in tetraparesis. Curr. Opin.
Biomed. Eng. 2019, 11, 85–101.
103. Lécuyer, A.; Lotte, F.; Reilly, R.B.; Leeb, R.; Hirose, M.; Slater, M. Brain–computer interfaces, virtual reality, and videogames.
Computer 2008, 41, 66–72. doi:10.1109/MC.2008.410.
104. Yuan, P.; Wang, Y.; Gao, X.; Jung, T.P.; Gao, S. A Collaborative Brain-Computer Interface for Accelerating Human Decision Making.
In Universal Access in Human-Computer Interaction. Design Methods, Tools, and Interaction Techniques for eInclusion; Lecture Notes in
Computer Science; Stephanidis, C., Antona, M., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; Volume 8009, pp. 672–681.
doi:10.1007/978-3-642-39188-0_72.
105. Bianchi, L.; Gambardella, F.; Liti, C.; Piccialli, V. Group study via collaborative BCI. In Proceedings of the 2019 IEEE International
Conference on Systems, Man and Cybernetics (SMC), Bari, Italy, 6–9 October 2019; pp. 272–276. doi:10.1109/SMC.2019.8914482.
106. Kapeller, C.; Ortner, R.; Krausz, G.; Bruckner, M.; Allison, B.; Guger, C.; Edlinger, G. Toward Multi-brain Communication:
Collaborative Spelling with a P300 BCI. In Foundations of Augmented Cognition. Advancing Human Performance and Decision-Making
through Adaptive Systems, Proceedings of the AC 2014, Heraklion, Crete, Greece, 22–27 June 2014; Lecture Notes in Computer Science;
Schmorrow, D.D., Fidopiastis, C.M., Eds.; Springer: Cham, Switzerland, 2014; Volume 8534, pp. 47–54. doi:10.1007/978-3-319-07527-3_5.
107. Papanastasiou, G.; Drigas, A.; Skianis, C.; Lytras, M. Brain computer interface based applications for training and rehabilitation
of students with neurodevelopmental disorders. A literature review. Heliyon 2020, 6, e04250. doi:10.1016/j.heliyon.2020.e04250.
108. Aricò, P.; Borghini, G.; Di Flumeri, G.; Sciaraffa, N.; Colosimo, A.; Babiloni, F. Passive BCI in operational environments: Insights,
recent advances, and future trends. IEEE Trans. Biomed. Eng. 2017, 64, 1431–1436.
109. Aricò, P.; Borghini, G.; Di Flumeri, G.; Bonelli, S.; Golfetti, A.; Graziani, I. Human Factors and Neurophysiological Metrics in Air
Traffic Control: a Critical Review. IEEE Rev. Biomed. Eng. 2017, 13. doi:10.1109/rbme.2017.2694142.
110. Burwell, S.; Sample, M.; Racine, E. Ethical aspects of brain computer interfaces: A scoping review. BMC Med. Ethics 2017, 18, 60.
doi:10.1186/s12910-017-0220-y.
111. Klein, E.; Peters, B.; Higger, M. Ethical considerations in ending exploratory brain-computer interface research studies in locked-in
syndrome. Camb. Q. Healthc. Ethics 2018, 27, 660–674. doi:10.1017/S0963180118000154.
112. Vlek, R.J.; Steines, D.; Szibbo, D.; Kübler, A.; Schneider, M.J.; Haselager, P.; Nijboer, F. Ethical issues in brain-computer interface
research, development, and dissemination. J. Neurol. Phys. Ther. 2012, 36, 94–99. doi:10.1097/NPT.0b013e31825064cc.
113. Schermer, M. Ethical issues in deep brain stimulation. Front. Integr. Neurosci. 2011, 5, 17. doi:10.3389/fnint.2011.00017.
114. Landau, O.; Cohen, A.; Gordon, S.; Nissim, N. Mind your privacy: Privacy leakage through BCI applications using machine
learning methods. Knowl. Based Syst. 2020, 198, 105932. doi:10.1016/j.knosys.2020.105932.
115. Ajrawi, S.; Rao, R.; Sarkar, M. Cybersecurity in brain-computer interfaces: RFID-based design-theoretical framework. Inform.
Med. Unlocked 2021, 22, 100489. doi:10.1016/j.imu.2020.100489.
116. Li, Q.; Ding, D.; Conti, M. Brain-Computer Interface applications: Security and privacy challenges. In Proceedings of the
2015 IEEE Conference on Communications and Network Security (CNS), Florence, Italy, 28–30 September 2015; pp. 663–666.
doi:10.1109/CNS.2015.7346884.
117. Mowla, M.; Cano, R.; Dhuyvetter, K.; Thompson, D. Affective brain-computer interfaces: Choosing a meaningful performance
measuring metric. Comput. Biol. Med. 2020, 126, 104001. doi:10.1016/j.compbiomed.2020.104001.
118. Gubert, P.; Costa, M.; Silva, C.; Trofino-Neto, A. The performance impact of data augmentation in CSP-based motor-imagery
systems for BCI applications. Biomed. Signal Process. Control 2020, 62, 102152. doi:10.1016/j.bspc.2020.102152.
119. North, S.; Young, K.; Hamilton, M.; Kim, J.; Zhao, X.; North, M.; Cronnon, E. Brain-Computer Interface: An Experimental
Analysis of Performance Measurement. In Proceedings of the IEEE SoutheastCon 2020, Raleigh, NC, USA, 28–29 March 2020;
pp. 1–8. doi:10.1109/SoutheastCon44009.2020.9249709.
120. Myers, I.B.; McCaulley, M.H.; Quenk, N.L.; Hammer, A.L. MBTI Handbook: A Guide to the Development and Use of the Myers-Briggs
Type Indicator, 3rd ed.; Consulting Psychologists Press: Sunnyvale, CA, USA, 1998.
121. Yadav, D.; Yadav, S.; Veer, K. A comprehensive assessment of brain computer interfaces: Recent trends and challenges. J. Neurosci.
Methods 2020, 346, 108918. doi:10.1016/j.jneumeth.2020.108918.
122. Kögel, J.; Jox, R.; Friedrich, O. What is it like to use a BCI?—Insights from an interview study with brain-computer interface
users. BMC Med. Ethics 2020, 21, 2. doi:10.1186/s12910-019-0442-2.
123. Wolpaw, J.R.; McFarland, D.J. Control of a two-dimensional movement signal by a noninvasive brain-computer interface in
humans. Proc. Natl. Acad. Sci. USA 2004, 101, 17849–17854.
124. McFarland, D.J.; Wolpaw, J.R. Brain-computer interfaces for communication and control. Commun. ACM 2011, 54, 60–66.
doi:10.1145/1941487.1941506.
125. Di Flumeri, G.; Aricò, P.; Borghini, G.; Sciaraffa, N.; Di Florio, A.; Babiloni, F. The Dry Revolution: Evaluation of Three Different
EEG Dry Electrode Types in Terms of Signal Spectral Features, Mental States Classification and Usability. Sensors 2019, 19, 1365.
doi:10.3390/s19061365.
126. Malmivuo, J.; Plonsey, R. Bioelectromagnetism: Principles and Applications of Bioelectric and Biomagnetic Fields; Oxford University
Press: Oxford, UK, 1995.
127. Nuwer, M.R. Recording electrode site nomenclature. J. Clin. Neurophysiol. 1987, 4, 121.
128. Suarez, E.; Viegas, M.D.; Adjouadi, M.; Barreto, A. Relating induced changes in EEG signals to orientation of visual stimuli using
the ESI-256 machine. Biomed. Sci. Instrum. 2000, 36, 33.
129. Oostenveld, R.; Praamstra, P. The five percent electrode system for high-resolution EEG and ERP measurements. Clin.
Neurophysiol. 2001, 112, 713–719. doi:10.1016/S1388-2457(00)00527-7.
130. Jurcak, V.; Tsuzuki, D.; Dan, I. 10/20, 10/10, and 10/5 systems revisited: their validity as relative head-surface-based positioning
systems. NeuroImage 2007, 34, 1600–1611.
131. Crosson, B.; Ford, A.; McGregor, K.M.; Meinzer, M.; Cheshkov, S.; Li, X.; Walker-Batson, D.; Briggs, R.W. Functional imaging and
related techniques: An introduction for rehabilitation researchers. J. Rehabil. Res. Dev. 2010, 47, vii–xxxiv.
132. Dornhege, G.; Millán, J.d.R.; Hinterberger, T.; McFarland, D.; Müller, K.R. (Eds.) Toward Brain-Computer Interfacing; A Bradford
Book; The MIT Press: Cambridge, MA, USA, 2007.
133. Luck, S.J. An Introduction to the Event-Related Potential Technique; The MIT Press: Cambridge, MA, USA, 2014.
134. Picton, T.W. The P300 wave of the human event-related potential. J. Clin. Neurophysiol. 1992, 9, 456–479.
135. Sellers, E.W.; Donchin, E. A P300-based brain–computer interface: initial tests by ALS patients. Clin. Neurophysiol. 2006,
117, 538–548.
136. Blankertz, B.; Dornhege, G.; Krauledat, M.; Schröder, M.; Williamson, J.; Murray, S.R.; Müller, K.R. The Berlin Brain-Computer
Interface presents the novel mental typewriter Hex-o-Spell. Handb. Clin. Neurol. 2006, 113, 108–109.
137. Blankertz, B.; Krauledat, M.; Dornhege, G.; Williamson, J.; Murray, S.R.; Müller, K.R. A Note on Brain Actuated Spelling with the
Berlin Brain-Computer Interface. Lect. Notes Comput. Sci. 2007, 4555, 759–768. doi:10.1007/978-3-540-73281-5_83.
138. Treder, M.S.; Blankertz, B. Covert attention and visual speller design in an ERP-based brain-computer interface. Behav. Brain
Funct. 2010, 6. doi:10.1186/1744-9081-6-28.
139. Treder, M.S.; Schmidt, N.M.; Blankertz, B. Gaze-independent brain-computer interfaces based on covert attention and feature
attention. J. Neural Eng. 2011, 8, 066003. doi:10.1088/1741-2560/8/6/066003.
140. Schreuder, M.; Blankertz, B.; Tangermann, M. A New Auditory Multi-Class Brain-Computer Interface Paradigm: Spatial Hearing
as an Informative Cue. PLoS ONE 2010, 5, e9813. doi:10.1371/journal.pone.0009813.
141. Rader, S.; Holmes, J.; Golob, E. Auditory event-related potentials during a spatial working memory task. Clin. Neurophysiol. 2008,
119, 1176–1189.
142. Solon, A.; Lawhern, V.; Touryan, J.; McDaniel, J.; Ries, A.; Gordon, S. Decoding P300 variability using convolutional neural
networks. Front. Hum. Neurosci. 2019, 13, 201. doi:10.3389/fnhum.2019.00201.
143. Fernández-Rodríguez, A.; Medina-Juliá, M.; Velasco-Álvarez, F.; Ron-Angevin, R. Effects of spatial stimulus overlap in a visual
P300-based brain-computer interface. Neuroscience 2020, 431, 134–142. doi:10.1016/j.neuroscience.2020.02.011.
144. Regan, D. Steady-state evoked potentials. J. Opt. Soc. Am. 1977, 67, 1475–1489.
145. Gao, X.; Xu, D.; Cheng, M.; Gao, S. A BCI-based environmental controller for the motion-disabled. IEEE Trans. Neural Syst.
Rehabil. Eng. 2003, 11, 137–140.
146. Middendorf, M.; McMillan, G.; Calhoun, G.; Jones, K.S. Brain-computer interfaces based on the steady-state visual-evoked
response. IEEE Trans. Rehabil. Eng. 2000, 8, 211–214.
147. Fisher, R.S.; Harding, G.; Erba, G.; Barkley, G.L.; Wilkins, A. Photic- and Pattern-induced Seizures: A Review for the Epilepsy
Foundation of America Working Group. Epilepsia 2005, 46, 1426–1441.
148. Zhu, D.; Bieger, J.; Molina, G.G.; Aarts, R.M. A survey of stimulation methods used in SSVEP-based BCIs. Comput. Intell. Neurosci.
2010, 10, 1–12.
149. Vialatte, F.B.; Maurice, M.; Dauwels, J.; Cichocki, A. Steady-state visually evoked potentials: Focus on essential paradigms and
future perspectives. Prog. Neurobiol. 2010, 90, 418–438.
150. Wang, Y.; Gao, X.; Hong, B.; Jia, C.; Gao, S. Brain-computer interfaces based on visual evoked potentials. IEEE Eng. Med. Biol.
Mag. 2008, 27, 64–71.
151. Zhang, Y.; Xu, P.; Cheng, K.; Yao, D. Multivariate synchronization index for frequency recognition of SSVEP-based brain-computer
interface. J. Neurosci. Methods 2014, 221, 32–40. doi:10.1016/j.jneumeth.2013.07.018.
152. Wang, Y.; Chen, X.; Gao, X.; Gao, S. A Benchmark Dataset for SSVEP-Based Brain-Computer Interfaces. IEEE Trans. Neural Syst.
Rehabil. Eng. 2017, 25, 1746–1752. doi:10.1109/TNSRE.2016.2627556.
153. Nakanishi, M.; Wang, Y.; Wang, Y.T.; Jung, T.P. Does frequency resolution affect the classification performance of steady-state
visual evoked potentials? In Proceedings of the 2017 8th International IEEE/EMBS Conference on Neural Engineering (NER),
Shanghai, China, 25–28 May 2017; pp. 341–344. doi:10.1109/NER.2017.8008360.
154. Hotelling, H. Relations between two sets of variates. Biometrika 1936, 28, 321–377.
155. Anderson, T.W. An Introduction to Multivariate Statistical Analysis; Wiley: New York, NY, USA, 1958; Volume 2.
156. Lin, Z.; Zhang, C.; Wu, W.; Gao, X. Frequency recognition based on canonical correlation analysis for SSVEP-based BCIs.
IEEE Trans. Biomed. Eng. 2006, 53, 2610–2614.
157. Chen, X.; Wang, Y.; Gao, S.; Jung, T.P.; Gao, X. Filter bank canonical correlation analysis for implementing a high-speed
SSVEP-based brain-computer interface. J. Neural Eng. 2015, 12, 046008. doi:10.1088/1741-2560/12/4/046008.
158. Rabiul Islam, M.; Khademul Islam Molla, M.; Nakanishi, M.; Tanaka, T. Unsupervised frequency-recognition method of SSVEPs
using a filter bank implementation of binary subband CCA. J. Neural Eng. 2017, 14. doi:10.1088/1741-2552/aa5847.
159. Tanaka, T.; Zhang, C.; Higashi, H. SSVEP frequency detection methods considering background EEG. In Proceedings of the
6th Joint International Conference on Soft Computing and Intelligent Systems (SCIS) and 13th International Symposium on
Advanced Intelligent Systems (ISIS), Kobe, Japan, 20–24 November 2012; pp. 1138–1143.
160. Wei, C.S.; Lin, Y.P.; Wang, Y.; Wang, Y.T.; Jung, T.P. Detection of steady-state visual-evoked potential using differential canonical
correlation analysis. In Proceedings of the 6th International IEEE/EMBS Conference on Neural Engineering (NER), San Diego,
CA, USA, 6–8 November 2013; pp. 57–60.
161. Nakanishi, M.; Wang, Y.; Wang, Y.T.; Mitsukura, Y.; Jung, T.P. Enhancing unsupervised canonical correlation analysis-based
frequency detection of SSVEPs by incorporating background EEG. In Proceedings of the 36th Annual International Conference of
the IEEE Engineering in Medicine and Biology Society (EMBC), Chicago, IL, USA, 26–30 August 2014; pp. 3053–3056.
162. Nakanishi, M.; Wang, Y.; Wang, Y.T.; Mitsukura, Y.; Jung, T.P. A high-speed brain speller using steady-state visual evoked
potentials. Int. J. Neural Syst. 2014, 24, 1450019. doi:10.1142/S0129065714500191.
163. Nakanishi, M.; Wang, Y.; Chen, X.; Wang, Y.T.; Gao, X.; Jung, T.P. Enhancing detection of SSVEPs for a high-speed brain speller
using task-related component analysis. IEEE Trans. Biomed. Eng. 2018, 65, 104–112. doi:10.1109/TBME.2017.2694818.
164. Zhang, Y.; Yin, E.; Li, F.; Zhang, Y.; Tanaka, T.; Zhao, Q.; Cui, Y.; Xu, P.; Yao, D.; Guo, D. Two-Stage Frequency Recognition Method
Based on Correlated Component Analysis for SSVEP-Based BCI. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 1314–1323.
doi:10.1109/TNSRE.2018.2848222.
165. Zhang, Y.; Guo, D.; Li, F.; Yin, E.; Zhang, Y.; Li, P.; Zhao, Q.; Tanaka, T.; Yao, D.; Xu, P. Correlated component analysis for
enhancing the performance of SSVEP-based brain-computer interface. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 948–956.
doi:10.1109/TNSRE.2018.2826541.
166. Tanaka, H.; Katura, T.; Sato, H. Task-related component analysis for functional neuroimaging and application to near-infrared
spectroscopy data. NeuroImage 2013, 64, 308–327. doi:10.1016/j.neuroimage.2012.08.044.
167. Suefusa, K.; Tanaka, T. Visually stimulated brain-computer interfaces compete with eye tracking interfaces when using small
targets. In Proceedings of the 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society
(EMBC), Chicago, IL, USA, 26–30 August 2014; pp. 4005–4008. doi:10.1109/EMBC.2014.6944502.
168. Suefusa, K.; Tanaka, T. A comparison study of visually stimulated brain-computer and eye-tracking interfaces. J. Neural Eng.
2017, 14. doi:10.1088/1741-2552/aa6086.
169. Bakardjian, H.; Tanaka, T.; Cichocki, A. Optimization of SSVEP brain responses with application to eight-command brain-
computer interface. Neurosci. Lett. 2010, 469, 34–38.
170. Kimura, Y.; Tanaka, T.; Higashi, H.; Morikawa, N. SSVEP-based brain–computer interfaces using FSK-modulated visual stimuli.
IEEE Trans. Biomed. Eng. 2013, 60, 2831–2838. doi:10.1109/TBME.2013.2265260.
171. Bin, G.; Gao, X.; Wang, Y.; Hong, B.; Gao, S. VEP-based brain-computer interfaces: Time, frequency, and code modulations.
IEEE Comput. Intell. Mag. 2009, 4, 22–26.
172. Guo, F.; Hong, B.; Gao, X.; Gao, S. A brain–computer interface using motion-onset visual evoked potential. J. Neural Eng. 2008,
5, 477–485.
173. Tanji, Y.; Nakanishi, M.; Suefusa, K.; Tanaka, T. Waveform-Based Multi-Stimulus Coding for Brain-Computer Interfaces Based on
Steady-State Visual Evoked Potentials. In Proceedings of the 2018 IEEE International Conference on Acoustics, Speech and Signal
Processing (ICASSP), Calgary, AB, Canada, 15–20 April 2018; pp. 821–825. doi:10.1109/ICASSP.2018.8462246.
174. Chen, X.; Wang, Y.; Nakanishi, M.; Gao, X.; Jung, T.P.; Gao, S. High-speed spelling with a noninvasive brain-computer interface.
Proc. Natl. Acad. Sci. USA 2015, 112, E6058–E6067. doi:10.1073/pnas.1508080112.
175. Norcia, A.M.; Appelbaum, L.G.; Ales, J.M.; Cottereau, B.R.; Rossion, B. The steady-state visual evoked potential in vision research:
A review. J. Vis. 2015, 15, 4. doi:10.1167/15.6.4.
176. İşcan, Z.; Nikulin, V.V. Steady state visual evoked potential (SSVEP) based brain-computer interface (BCI) performance under
different perturbations. PLoS ONE 2018, 13, e0191673. doi:10.1371/journal.pone.0191673.
177. Muller-Putz, G.R.; Scherer, R.; Neuper, C.; Pfurtscheller, G. Steady-state somatosensory evoked potentials: suitable brain signals
for brain-computer interfaces? IEEE Trans. Neural Syst. Rehabil. Eng. 2006, 14, 30–37.
178. Goodin, D.S.; Squires, K.C.; Starr, A. Long latency event-related components of the auditory evoked potential in dementia. Brain
J. Neurol. 1978, 101, 635–648.
179. Higashi, H.; Rutkowski, T.M.; Washizawa, Y.; Cichocki, A.; Tanaka, T. EEG auditory steady state responses classification for the
novel BCI. In Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society
(EMBC), Boston, MA, USA, 30 August–3 September 2011; pp. 4576–4579.
180. Guillot, A.; Collet, C. The Neurophysiological Foundations of Mental and Motor Imagery; Oxford University Press: Oxford, UK, 2010.
181. Pfurtscheller, G.; Brunner, C.; Schlögl, A.; Lopes da Silva, F. Mu rhythm (de)synchronization and EEG single-trial classification of
different motor imagery tasks. Neuroimage 2006, 31, 153–159.
182. Sanei, S.; Chambers, J. EEG Signal Processing; Wiley-Interscience: Chichester, UK, 2007; Chapter 1.
183. Pfurtscheller, G.; Neuper, C.; Flotzinger, D.; Pregenzer, M. EEG-based discrimination between imagination of right and left hand
movement. Electroencephalogr. Clin. Neurophysiol. 1997, 103, 642–651.
184. Pfurtscheller, G.; Lopes da Silva, F.H. Event-related EEG/MEG synchronization and desynchronization: Basic principles. Clin. Neurophysiol. 1999, 110, 1842–1857.
185. Dornhege, G.; Blankertz, B.; Curio, G.; Müller, K.R. Boosting bit rates in noninvasive EEG single-trial classifications by feature
combination and multiclass paradigms. IEEE Trans. Biomed. Eng. 2004, 51, 993–1002.
186. Grosse-Wentrup, M.; Buss, M. Multiclass common spatial patterns and information theoretic feature extraction. IEEE Trans.
Biomed. Eng. 2008, 55, 1991–2000.
187. Kübler, A.; Nijboer, F.; Mellinger, J.; Vaughan, T.M.; Pawelzik, H.; Schalk, G.; McFarland, D.J.; Birbaumer, N.; Wolpaw, J.R. Patients
with ALS can use sensorimotor rhythms to operate a brain-computer interface. Neurology 2005, 64, 1775–1777.
188. Higashi, H.; Tanaka, T.; Funase, A. Classification of single trial EEG during imagined hand movement by rhythmic component
extraction. In Proceedings of the 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society
(EMBC), Minneapolis, MN, USA, 3–6 September 2009; pp. 2482–2485. doi:10.1109/IEMBS.2009.5334806.
189. Pfurtscheller, G.; Muller-Putz, G.R.; Scherer, R.; Neuper, C. Rehabilitation with brain-computer interface systems. Computer 2008,
41, 58–65. doi:10.1109/MC.2008.432.
190. Page, S.J.; Levine, P.; Sisto, S.A.; Johnston, M.V. Mental practice combined with physical practice for upper-limb motor deficit in
subacute stroke. Phys. Therapy 2001, 81, 1455–1462.
191. Sharma, N.; Pomeroy, V.M.; Baron, J.C. Motor imagery: A backdoor to the motor system after stroke? Stroke 2006, 37, 1941–1952.
192. Várkuti, B.; Guan, C.; Pan, Y.; Phua, K.S.; Ang, K.K.; Kuah, C.W.K.; Chua, K.; Ang, B.T.; Birbaumer, N.; Sitaram, R. Resting state
changes in functional connectivity correlate with movement recovery for BCI and robot-assisted upper-extremity training after
stroke. Neurorehabilit. Neural Repair 2013, 27, 53–62. doi:10.1177/1545968312445910.
193. Lambercy, O.; Dovat, L.; Gassert, R.; Burdet, E.; Teo, C.L.; Milner, T. A Haptic Knob for Rehabilitation of Hand Function.
IEEE Trans. Neural Syst. Rehabil. Eng. 2007, 15, 356–366.
194. Alegre, M.; Labarga, A.; Gurtubay, I.G.; Iriarte, J.; Malanda, A.; Artieda, J. Beta electroencephalograph changes during passive
movements: Sensory afferences contribute to beta event-related desynchronization in humans. Neurosci. Lett. 2002, 331, 29–32.
doi:10.1016/S0304-3940(02)00825-X.
195. Cochin, S.; Barthelemy, C.; Roux, S.; Martineau, J. Observation and execution of movement: similarities demonstrated by
quantified electroencephalography. Eur. J. Neurosci. 1999, 11, 1839–1842.
196. Friesen, C.L.; Bardouille, T.; Neyedli, H.F.; Boe, S.G. Combined action observation and motor imagery neurofeedback for
modulation of brain activity. Front. Hum. Neurosci. 2017, 10, 692.
197. Nagai, H.; Tanaka, T. Action Observation of Own Hand Movement Enhances Event-Related Desynchronization. IEEE Trans.
Neural Syst. Rehabil. Eng. 2019, 27, 1407–1415.
198. Müller-Gerking, J.; Pfurtscheller, G.; Flyvbjerg, H. Designing optimal spatial filters for single-trial EEG classification in a
movement task. Clin. Neurophysiol. 1999, 110, 787–798.
199. Ramoser, H.; Muller-Gerking, J.; Pfurtscheller, G. Optimal spatial filtering of single trial EEG during imagined hand movement.
IEEE Trans. Rehabil. Eng. 2000, 8, 441–446.
200. Lemm, S.; Blankertz, B.; Curio, G.; Müller, K.R. Spatio-spectral filters for improving the classification of single trial EEG. IEEE
Trans. Biomed. Eng. 2005, 52, 1541–1548.
201. Dornhege, G.; Blankertz, B.; Krauledat, M.; Losch, F.; Curio, G.; Muller, K.R. Combined optimization of spatial and temporal
filters for improving brain-computer interfacing. IEEE Trans. Biomed. Eng. 2006, 53, 2274–2281.
202. Tomioka, R.; Müller, K.R. A regularized discriminative framework for EEG analysis with application to brain–computer interface.
NeuroImage 2010, 49, 415–432.
203. Wu, W.; Gao, X.; Hong, B.; Gao, S. Classifying single-trial EEG during motor imagery by iterative spatio-spectral patterns learning
(ISSPL). IEEE Trans. Biomed. Eng. 2008, 55, 1733–1743.
204. Ang, K.K.; Chin, Z.Y.; Zhang, H.; Guan, C. Filter bank common spatial pattern (FBCSP) in brain-computer interface. In
Proceedings of the IEEE World Congress on Computational Intelligence—IEEE International Joint Conference on Neural
Networks, Hong Kong, China, 1–8 June 2008; pp. 2390–2397.
205. Higashi, H.; Tanaka, T. Simultaneous design of FIR filter banks and spatial patterns for EEG signal classification. IEEE Trans.
Biomed. Eng. 2013, 60, 1100–1110.
206. Higashi, H.; Tanaka, T. Common spatio-time-frequency patterns for motor imagery-based brain machine interfaces. Comput.
Intell. Neurosci. 2013, 2013, 8.
207. Tomida, N.; Tanaka, T.; Ono, S.; Yamagishi, M.; Higashi, H. Active Data Selection for Motor Imagery EEG Classification.
IEEE Trans. Biomed. Eng. 2014, 62, 458–467. doi:10.1109/TBME.2014.2358536.
208. Barachant, A.; Bonnet, S.; Congedo, M.; Jutten, C. Multiclass brain-computer interface classification by Riemannian geometry.
IEEE Trans. Biomed. Eng. 2012, 59, 920–928.
209. Uehara, T.; Tanaka, T.; Fiori, S. Robust averaging of covariance matrices for motor-imagery brain-computer interfacing by
Riemannian geometry. In Advances in Cognitive Neurodynamics (V): Proceedings of the Fifth International Conference on Cognitive
Neurodynamics-2015; Springer: Singapore, 2016; pp. 347–353.
210. Barachant, A.; Bonnet, S.; Congedo, M.; Jutten, C. Classification of covariance matrices using a Riemannian-based kernel for BCI
applications. Neurocomputing 2013, 112, 172–178. doi:10.1016/j.neucom.2012.12.039.
211. Islam, M.R.; Tanaka, T.; Molla, M.K.I. Multiband tangent space mapping and feature selection for classification of EEG during
motor imagery. J. Neural Eng. 2018, 15. doi:10.1088/1741-2552/aac313.
212. Yger, F.; Berar, M.; Lotte, F. Riemannian Approaches in Brain-Computer Interfaces: A Review. IEEE Trans. Neural Syst. Rehabil.
Eng. 2017, 25, 1753–1762. doi:10.1109/TNSRE.2016.2627016.
213. Fiori, S.; Tanaka, T. An algorithm to compute averages on matrix Lie groups. IEEE Trans. Signal Process. 2009, 57, 4734–4743.
214. Fiori, S. Learning the Fréchet mean over the manifold of symmetric positive-definite matrices. Cogn. Comput. 2009, 1, 279–291.
215. Uehara, T.; Sartori, M.; Tanaka, T.; Fiori, S. Robust Averaging of Covariances for EEG Recordings Classification in Motor Imagery
Brain-Computer Interfaces. Neural Comput. 2017, 29, 1631–1666. doi:10.1162/NECO_a_00963.
216. Graimann, B.; Pfurtscheller, G.; Allison, B. Brain-Computer Interfaces—Revolutionizing Human-Computer Interaction; Springer:
Berlin/Heidelberg, Germany, 2010.
217. Ladouce, S.; Donaldson, D.; Dudchenko, P.; Ietswaart, M. Understanding Minds in Real-World Environments: Toward a Mobile
Cognition Approach. Front. Hum. Neurosci. 2017, 10. doi:10.3389/fnhum.2016.00694.
218. Abiri, R.; Borhani, S.; Sellers, E.W.; Jiang, Y.; Zhao, X. A comprehensive review of EEG-based brain-computer interface paradigms.
J. Neural Eng. 2019, 16. doi:10.1088/1741-2552/aaf12e.
219. Lazarou, I.; Nikolopoulos, S.; Petrantonakis, P.; Kompatsiaris, I.; Tsolaki, M. EEG-Based Brain–Computer Interfaces for
Communication and Rehabilitation of People with Motor Impairment: A Novel Approach of the 21st Century. Front. Hum.
Neurosci. 2018, 12, 14. doi:10.3389/fnhum.2018.00014.
220. Shih, J.; Krusienski, D.; Wolpaw, J. Brain-computer interfaces in medicine. Mayo Clinic Proc. 2012, 87, 268–279.
doi:10.1016/j.mayocp.2011.12.008.
221. Huang, Q.; Zhang, Z.; Yu, T.; He, S.; Li, Y. An EEG-EOG-Based Hybrid Brain-Computer Interface: Application on Controlling an
Integrated Wheelchair Robotic Arm System. Front. Neurosci. 2019, 13. doi:10.3389/fnins.2019.01243.
222. Yuanqing, L.; Chuanchu, W.; Haihong, Z.; Cuntai, G. An EEG-based BCI system for 2D cursor control. In Proceedings of the IEEE
International Joint Conference on Neural Networks, IJCNN, and IEEE World Congress on Computational Intelligence, Hong
Kong, China, 1–8 June 2008; pp. 2214–2219. doi:10.1109/IJCNN.2008.4634104.
223. Donchin, E.; Spencer, K.M.; Wijesinghe, R. The mental prosthesis: assessing the speed of a P300-based brain-computer interface.
IEEE Trans. Rehabil. Eng. 2000, 8, 174–179. doi:10.1109/86.847808.
224. Hong, B.; Guo, F.; Liu, T.; Gao, X.; Gao, S. N200-speller using motion-onset visual response. Clin. Neurophysiol. 2009,
120, 1658–1666.
225. Karim, A.A.; Hinterberger, T.; Richter, J.; Mellinger, J.; Neumann, N.; Flor, H.; Kübler, A.; Birbaumer, N. Neural internet: Web
surfing with brain potentials for the completely paralyzed. Neurorehabilit. Neural Repair 2006, 20, 508–515.
226. Bensch, M.; Karim, A.A.; Mellinger, J.; Hinterberger, T.; Tangermann, M.; Bogdan, M.; Rosenstiel, W.; Birbaumer, N.
Nessi: An EEG-Controlled Web Browser for Severely Paralyzed Patients. Comput. Intell. Neurosci. 2007, 7, 508–515.
doi:10.1155/2007/71863.
227. Marshall, D.; Coyle, D.; Wilson, S.; Callaghan, M. Games, gameplay, and BCI: The state of the art. IEEE Trans. Comput. Intell. AI
Games 2013, 5, 82–99. doi:10.1109/TCIAIG.2013.2263555.
228. LaFleur, K.; Cassady, K.; Doud, A.; Shades, K.; Rogin, E.; He, B. Quadcopter control in three-dimensional space using a noninvasive
motor imagery-based brain–computer interface. J. Neural Eng. 2013, 10, 046003. doi:10.1088/1741-2560/10/4/046003.
229. Royer, A.; He, B. Goal selection versus process control in a brain-computer interface based on sensorimotor rhythms. J. Neural
Eng. 2009, 6, 016005. doi:10.1088/1741-2560/6/1/016005.
230. Royer, A.S.; Doud, A.J.; Rose, M.L.; He, B. EEG control of a virtual helicopter in 3-dimensional space using intelligent control
strategies. IEEE Trans. Neural Syst. Rehabil. Eng. 2010, 18, 581–589. doi:10.1109/TNSRE.2010.2077654.
231. Doud, A.J.; Lucas, J.P.; Pisansky, M.T.; He, B. Continuous Three-Dimensional Control of a Virtual Helicopter Using a Motor
Imagery Based Brain-Computer Interface. PLoS ONE 2011, 6, e26322. doi:10.1371/journal.pone.0026322.
232. Chae, Y.; Jeong, J.; Jo, S. Toward Brain-Actuated Humanoid Robots: Asynchronous Direct Control Using an EEG-Based BCI.
IEEE Trans. Robot. 2012, 28, 1131–1144. doi:10.1109/TRO.2012.2201310.
233. Alshabatat, A.; Vial, P.; Premaratne, P.; Tran, L. EEG-based brain-computer interface for automating home appliances. J. Comput.
2014, 9, 2159–2166. doi:10.4304/jcp.9.9.2159-2166.
234. Lin, C.T.; Lin, B.S.; Lin, F.C.; Chang, C.J. Brain Computer Interface-Based Smart Living Environmental Auto-Adjustment Control
System in UPnP Home Networking. IEEE Syst. J. 2014, 8, 363–370. doi:10.1109/JSYST.2012.2192756.
235. Ou, C.Z.; Lin, B.S.; Chang, C.J.; Lin, C.T. Brain Computer Interface-based Smart Environmental Control System. In Proceed-
ings of the Eighth International Conference on Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP),
Piraeus-Athens, Greece, 18–20 July 2012; pp. 281–284. doi:10.1109/IIH-MSP.2012.74.
236. Edlinger, G.; Holzner, C.; Guger, C.; Groenegress, C.; Slater, M. Brain-computer interfaces for goal orientated control of a virtual
smart home environment. In Proceedings of the 4th International IEEE/EMBS Conference on Neural Engineering, Antalya,
Turkey, 29 April–2 May 2009; pp. 463–465. doi:10.1109/NER.2009.5109333.
237. Kanemura, A.; Morales, Y.; Kawanabe, M.; Morioka, H.; Kallakuri, N.; Ikeda, T.; Miyashita, T.; Hagita, N.; Ishii, S. A
waypoint-based framework in brain-controlled smart home environments: Brain interfaces, domotics, and robotics integra-
tion. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan,
3–7 November 2013; pp. 865–870. doi:10.1109/IROS.2013.6696452.
238. Bayliss, J.; Ballard, D. A virtual reality testbed for brain-computer interface research. IEEE Trans. Rehabil. Eng. 2000, 8, 188–190.
doi:10.1109/86.847811.
239. Bayliss, J. Use of the evoked potential P3 component for control in a virtual apartment. IEEE Trans. Neural Syst. Rehabil. Eng.
2003, 11, 113–116. doi:10.1109/TNSRE.2003.814438.
240. Ianez, E.; Azorin, J.M.; Ubeda, A.; Ferrandez, J.M.; Fernandez, E. Mental tasks-based brain–robot interface. Robot. Auton. Syst.
2010, 58, 1238–1245. doi:10.1016/j.robot.2010.08.007.
268. Lance, B.J.; Kerick, S.E.; Ries, A.J.; Oie, K.S.; McDowell, K. Brain-computer interface technologies in the coming decades.
Proc. IEEE 2012, 100, 1585–1599. doi:10.1109/JPROC.2012.2184830.
269. Grosse-Wentrup, M.; Schölkopf, B.; Hill, J. Causal influence of gamma oscillations on the sensorimotor rhythm. NeuroImage 2011,
56, 837–842. doi:10.1016/j.neuroimage.2010.04.265.