
Journal of Neural Engineering

ACCEPTED MANUSCRIPT

A comprehensive review of EEG-based brain-computer interface paradigms

To cite this article before publication: Reza Abiri et al 2018 J. Neural Eng. in press https://doi.org/10.1088/1741-2552/aaf12e

Manuscript version: Accepted Manuscript



This Accepted Manuscript is © 2018 IOP Publishing Ltd.




AUTHOR SUBMITTED MANUSCRIPT - JNE-102497.R2

A Comprehensive Review of EEG-based Brain-Computer Interface Paradigms

Reza Abiri 1,2,a, Soheil Borhani 2,b, Eric W Sellers 3,c, Yang Jiang 4,d, Xiaopeng Zhao 2,e,*

1 Department of Neurology, University of California, San Francisco/Berkeley, CA 94158, USA
2 Department of Mechanical, Aerospace, and Biomedical Engineering, University of Tennessee, Knoxville, TN 37996, USA
3 Department of Psychology, East Tennessee State University, Johnson City, TN 37614, USA
4 Department of Behavioral Science, College of Medicine, University of Kentucky, Lexington, KY 40356, USA

Emails: [email protected]; [email protected]; [email protected]; [email protected]; [email protected]; [email protected]

*Corresponding author: 313 Perkins Hall, University of Tennessee, Knoxville, TN 37996 USA, E-mail: [email protected]
Abstract

Advances in brain science and computer technology in the past decade have led to exciting developments in Brain-Computer Interface (BCI), thereby making BCI a top research area in applied science. The renaissance of BCI opens new methods of neurorehabilitation for physically disabled people (e.g., paralyzed patients and amputees) and patients with brain injuries (e.g., stroke patients). Recent technological advances such as wireless recording, machine learning analysis, and real-time temporal resolution have increased interest in electroencephalographic (EEG) based BCI approaches. Many BCI studies have focused on decoding EEG signals associated with whole-body kinematics/kinetics, motor imagery, and various senses. Thus, there is a need to understand the various experimental paradigms used in EEG-based BCI systems. Moreover, given that there are many available options, it is essential to choose the most appropriate BCI application to properly manipulate a neuroprosthetic or neurorehabilitation device. The current review evaluates EEG-based BCI paradigms regarding their advantages and disadvantages from a variety of perspectives. For each paradigm, various EEG decoding algorithms and classification methods are evaluated. The applications of these paradigms with targeted patients are summarized. Finally, potential problems with EEG-based BCI systems are discussed, and possible solutions are proposed.
1 Introduction

The concept of using brain signals to control prosthetic arms was developed in 1971 [1]. Since that time, researchers have been attempting to interpret brain waveforms to establish more accurate and convenient control over external devices. Later, this research area was termed brain-computer interface (BCI), and its applications spread rapidly [2].

BCI systems utilize recorded brain activity to communicate between the brain and computers to control the environment in a manner that is compatible with the intentions of humans [3]. There are two primary directions in which BCI systems have been applied. The first is studying brain activity to investigate a feedforward pathway used to control external devices without the aim of rehabilitation [4]. The other dominant direction is closed-loop BCI systems used during neurorehabilitation, in which the feedback loop plays a vital role in retraining neural plasticity or regulating brain activity [4].

Brain activity can be recorded through various neuroimaging methods [3, 5]. These methods can be categorized into two groups: invasive and noninvasive. Electrocorticography (ECoG) and electroencephalography (EEG) have become the most common invasive and noninvasive technologies, respectively [3]. ECoG, also known as intracranial EEG, is recorded from the cortical surface. Other invasive technologies record signals from within the brain using single-neuron action potentials (single units), multi-unit activity (MUA), and local field potentials (LFPs) [6, 7]. The high-quality spatial and temporal characteristics of these signals lead to successful decoding of biomechanical parameters [8-12]. These decoding achievements for upper limb kinematics using invasive electrodes in monkeys and humans have resulted in accurate control of prosthetic devices in 3D space [13-17]. However, invasive electrodes have significant drawbacks due to the risk of surgery and the gradual degradation of the recorded signals. Therefore, noninvasive approaches such as functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), near-infrared spectroscopy (NIRS), and EEG have become more widespread in human participants.

Although some noninvasive technologies provide higher spatial resolution (e.g., fMRI), EEG has proved to be the most popular method due to its direct measurement of neural activity, low cost, and portability for clinical use [3]. EEG measures the electrical brain activity caused by the flow of electric currents during synaptic excitation of neuronal dendrites, especially in the cortex but also in deep brain structures. The electric signals are recorded by placing electrodes on the scalp [3]. EEG signals have been used to control devices such as wheelchairs [18] and communication aid systems [19]. During the past decade, EEG methods have also become a promising approach to controlling assistive and rehabilitation devices [20]. EEG signals could provide a pathway from the brain to various external devices, resulting in brain-controlled assistive devices for disabled people and brain-controlled rehabilitation devices for patients with stroke and other neurological deficits [21-25]. One of the most challenging topics in BCI is finding and analyzing the relationships between recorded brain activity and underlying models of the human body, biomechanics, and cognitive processing. As a result,
investigation of relationships between EEG signals and upper limb movement, real and imaginary, has become a fascinating area of research in recent years [26, 27].

To implement an EEG-based BCI system for a particular application, a specific protocol and paradigm has to be chosen for all phases of the experiment. First, the subject performs a particular task (e.g., an imagery task or a visual task) in order to learn how to modulate their brain activity while EEG signals are recorded from the scalp. Using the recorded EEG as training data, a neural decoder for the paradigm is generated. Afterward, the subject performs the task again, and the neural decoder is used for BCI control.

Many EEG-based BCI review papers have been published [18, 23, 24, 28-32]; however, there is a lack of review or guidance comparing EEG-based BCI paradigms. Here we aim to review the most commonly employed EEG-based BCI paradigms. Guidelines on the algorithms and classification methods deployed to generate control signals from these paradigms are summarized. Each of these paradigms has advantages and disadvantages depending on a patient's physical condition and on user-friendliness. The current and future potential applications of these paradigms in the manipulation of external objects, rehabilitation, restoration, enhancement, and entertainment are investigated. Finally, present issues and limitations in EEG-based BCI systems are examined, and future possibilities for developing new paradigms are discussed.

2 Motor Imagery Paradigms

Motor imagery is described as imagining a movement rather than executing a real movement (for more detail on motor imagery see [27]). Previous studies have confirmed that imagination activates areas of the brain that are responsible for generating actual movement [33]. The most common motor imagery paradigms reported in the literature are sensorimotor rhythms (SMR) and imagined body kinematics (IBK). In the following sections, the paradigms are described in detail.

2.1 Sensorimotor Rhythms (SMR) Paradigms

2.1.1 Overview

The sensorimotor rhythms paradigm is one of the most popular motor imagery paradigms (e.g., [34], [35]). In this paradigm, the imagined movement is defined as the imagination of kinesthetic movements of large body parts such as the hands, feet, and tongue, which can result in modulations of brain activity [36].

Imagined movement in sensorimotor rhythm paradigms causes event-related desynchronization (ERD) in the mu (8-12 Hz) and beta (18-26 Hz) rhythms. In contrast, relaxation results in event-related synchronization (ERS) (for an in-depth review see [37]). The ERD and ERS modulations are most prominent in EEG signals acquired from electrode locations C3 and C4 (10/20 international system); these electrode locations are above the sensorimotor cortex. These modulated EEG signals in the aforementioned frequency bands (mu/beta) can be employed to control prosthetic devices. Wolpaw et al. [38] controlled a one-dimensional cursor using mu
rhythms. Figure 1 shows examples of changes in the frequency spectra of SMR during imagined hand movement.

The main drawback of the SMR paradigm is that the training time for 2D and 3D cursor control can take weeks or months. Training requires subjects to learn how to modulate specific frequency bands of neural activity to move a cursor in different directions to select desired targets.
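The ERD/ERS modulation described above is typically quantified as a relative change in band power between a rest (reference) interval and the imagery interval. A minimal sketch on synthetic single-channel data (the sampling rate and signal parameters are assumptions for illustration, not values from the cited studies):

```python
import numpy as np
from scipy.signal import welch

FS = 250  # sampling rate in Hz (assumed)

def band_power(x, fs, lo, hi):
    """Average power spectral density in [lo, hi] Hz via Welch's method."""
    f, pxx = welch(x, fs=fs, nperseg=fs)
    mask = (f >= lo) & (f <= hi)
    return pxx[mask].mean()

def erd_percent(rest, imagery, fs=FS, band=(8, 12)):
    """ERD% = (reference - active) / reference * 100 for the mu band.
    Positive values indicate desynchronization during imagery."""
    p_rest = band_power(rest, fs, *band)
    p_img = band_power(imagery, fs, *band)
    return (p_rest - p_img) / p_rest * 100.0

# Synthetic demo: a 10 Hz mu rhythm that is attenuated during imagery.
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / FS)
rest = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
imagery = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
print(f"mu-band ERD: {erd_percent(rest, imagery):.1f}%")  # large positive value
```

The same computation, repeated per electrode (e.g., C3 vs. C4), yields the lateralized power changes shown in Figure 1.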
Figure 1. An example of a change in frequency spectra for EEG recorded from C3 and C4. The top row (a, b) shows spectral power changes in the C3 and C4 electrodes while performing imagined movement of the right hand versus the left hand. The middle row (c, d) shows spectral power changes in the C3 and C4 electrodes while performing imagined movement of both hands versus rest. The bottom row (e, f) shows spectral power changes in the C3 and C4 electrodes for imagined movement of the right hand versus rest and the left hand versus rest, respectively (figure copied from [39] with permission under a Creative Commons Attribution (CC BY) license).

2.1.2 Analysis & Classification Methods

SMR paradigms have been employed by many research groups. For example, Wolpaw and McFarland introduced the first two-dimensional cursor control strategy [40]. The subjects' task was a center-out cursor task, where the cursor was guided to one of eight targets located around the perimeter of a computer monitor. In this work, each dimension of cursor movement was
controlled by a linear equation in which the independent variable was a weighted combination of the amplitudes in a mu or beta rhythm frequency band recorded from the right and left sensorimotor cortices. These changes were generated as the result of right- and left-hand imagined movements.
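A control law of this form can be sketched in a few lines; the channel weights below are illustrative placeholders, not the trained values from [40]:

```python
# Illustrative weights mapping mu-band amplitudes at left/right sensorimotor
# channels (C3, C4) to a 1-D cursor velocity; B is an intercept term.
W = {"C3": 0.8, "C4": -0.8}
B = 0.0

def cursor_velocity(mu_amplitude):
    """Weighted combination of mu-rhythm amplitudes -> cursor velocity."""
    return B + sum(W[ch] * mu_amplitude[ch] for ch in W)

# Right-hand imagery suppresses mu over the contralateral (left) hemisphere,
# so the C3 amplitude drops and the velocity goes negative here.
print(cursor_velocity({"C3": 2.0, "C4": 5.0}))   # -2.4
print(cursor_velocity({"C3": 5.0, "C4": 5.0}))   # 0.0
```

In the actual studies the weights were adapted over sessions (e.g., by least-mean-square updates, as listed for [40] in Table 1); the sketch only shows the forward mapping.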
Bhattacharyya et al. [41] compared the performance of different classification methods on EEG features for left/right hand imagery. They found that a kernelized SVM outperformed the other classifiers in accuracy. Murguialday et al. [42] designed a hierarchical linear classification scheme using the peak mu-band power to differentiate between relaxation, left-hand movement, and right-hand movement. For movement prediction of the right hand, left hand, tongue, and right foot, Morash et al. [36] showed that time-frequency features could better depict the non-stationary nature of EEG SMR. Using a parametric modeling approach, they divided time into bins of 256 ms and frequency into bins of 3.9 Hz and applied Naïve Bayes classification. However, parametric classification methods require a priori knowledge of subjects' EEG patterns, which is not always available for BCI control. Nonetheless, Chen et al. [43] used a three-layer neural network non-parametric approach, and they investigated an adaptive classifier for controlling an orthotic hand by motor imagery. A summary of previous SMR work is shown in Table 1.
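Many of the linear classifiers compared above reduce to fitting a linear boundary on band-power features. A minimal Fisher linear discriminant in plain NumPy, trained on synthetic two-class features (not data from [41]):

```python
import numpy as np

def fit_lda(X0, X1):
    """Fisher linear discriminant: w = Sw^-1 (m1 - m0), threshold at midpoint.
    X0, X1: (n_trials, n_features) band-power features per class."""
    m0, m1 = X0.mean(0), X1.mean(0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), m1 - m0)
    c = w @ (m0 + m1) / 2.0
    return w, c

def predict(w, c, X):
    return (X @ w > c).astype(int)

# Synthetic mu-power features at (C3, C4): left-hand vs. right-hand imagery.
rng = np.random.default_rng(1)
left = rng.normal([1.0, 3.0], 0.5, size=(100, 2))    # class 0
right = rng.normal([3.0, 1.0], 0.5, size=(100, 2))   # class 1
w, c = fit_lda(left, right)
acc = np.mean(np.r_[predict(w, c, left) == 0, predict(w, c, right) == 1])
print(f"training accuracy: {acc:.2f}")
```

Kernelized SVMs, as in [41], replace this linear boundary with a nonlinear one but consume the same kind of feature vectors.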
Table 1. Previous SMR paradigms. DWT: discrete wavelet transform, LMS: least mean square, STFT: short-time Fourier transform, CSP: common spatial pattern, N/A: not applicable.

Reference | Task | Feature | Classification method
[38] | Cursor control in 1D | Mu rhythm (8-12 Hz) amplitude | N/A
[44] | Cursor control in 2D | FFT + mu rhythm amplitude (7.5-16 Hz) | Linear regression
[45] | Grasping and object manipulation | DWT over 12-14 Hz and 18-22 Hz | LDA
[40] | Cursor control in 2D | Mu (8-12 Hz) and beta (18-26 Hz) rhythm amplitude | Linear regression + LMS to optimize weights
[42] | Control of a prosthetic hand | Peak mu (8-12 Hz) band power | A logistic regression (relaxation vs. motor imagery) + a logistic regression (right hand vs. left hand)
[43] | Control of a hand orthosis | STFT over mu band (8-14 Hz) | 3-layer feedforward NN classifying three classes (right hand, left hand, no imagination)
[46] | Control of a rehabilitation robot | CSP algorithm to select features | N/A
[47] | Control of a robotic arm | Time-frequency power of EEG over the recorded locations in the [10.5, 13.5] Hz frequency range | N/A
[48] | Control of a rehabilitation robot | Time-frequency power in the EEG alpha [8, 13] Hz, sigma [14, 18] Hz, and beta [18, 30] Hz bands over C3, C4, and Cz | LDA
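The CSP feature selection cited in the table ([46]) can be sketched as a generalized eigendecomposition of the two classes' spatial covariance matrices. A minimal version on synthetic trials (the shapes and parameters are assumptions for illustration):

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b):
    """Common spatial patterns: spatial filters W whose extreme columns
    maximize variance for one class while minimizing it for the other.
    trials_*: (n_trials, n_channels, n_samples)."""
    cov = lambda trials: np.mean(
        [t @ t.T / np.trace(t @ t.T) for t in trials], axis=0)
    Ca, Cb = cov(trials_a), cov(trials_b)
    # Generalized eigenproblem Ca w = lambda (Ca + Cb) w; sorting columns by
    # eigenvalue puts the most discriminative filters first and last.
    vals, W = eigh(Ca, Ca + Cb)
    return W[:, np.argsort(vals)[::-1]]

# Synthetic 2-channel trials: class A has high variance on channel 0,
# class B on channel 1.
rng = np.random.default_rng(2)
A = rng.standard_normal((30, 2, 200)) * np.array([3.0, 1.0])[:, None]
B = rng.standard_normal((30, 2, 200)) * np.array([1.0, 3.0])[:, None]
W = csp_filters(A, B)
# Log-variance of the CSP-filtered trial is the usual feature vector.
feat = lambda t: np.log(np.var(W.T @ t, axis=1))
print(feat(A[0]), feat(B[0]))
```

The resulting log-variance features are what a downstream classifier (LDA in most SMR work) actually consumes.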
2.1.3 Applications and Targeted Patient Populations

The SMR paradigm has been one of the most promising paradigms for patients with tetraplegia, spinal cord injury, and amyotrophic lateral sclerosis (ALS). The paradigm was first employed in a one-dimensional computer cursor control task by Wolpaw et al. [38]. A drawback of the method is a relatively lengthy training period of up to several weeks. Wolpaw and McFarland [44] used mu rhythms from four channels across the left and right central sulci to move a cursor in 2D space to targets located in the four corners of a computer monitor. Subsequently, they used the same paradigm with people who had spinal cord injuries to guide the cursor to eight different targets on the sides of a monitor by imagining right- and left-hand movement [40]. Finally, they expanded their work and controlled a cursor to hit targets located in 3-dimensional space [49]. In all of these studies the subjects learned to modulate their SMR based on imagery of large body parts such as the hands and legs.

Applications other than cursor control have also been demonstrated using SMR. Guger et al. [50] used SMR to open and close a prosthetic hand with imagined right- or left-hand movement. Pfurtscheller et al. [51] employed foot imagery to restore hand grasp in a patient with tetraplegia. Muller-Putz et al. [45] developed an EEG-based SMR system using imagined foot and hand movements to help a paralyzed patient perform simple tasks, such as grasping a cylinder and moving an object, by controlling a functional electrical stimulation (FES) device. Sun et al. [52] and Roy et al. [53] used motor imagery to control an artificial upper limb. Murguialday et al. [42] also used an SMR design to open and close a prosthetic hand. In recent years, SMR control signals have been applied to control objects such as quadcopters [39], virtual helicopters [54], and robotic manipulators [20, 47, 55]. SMR is also employed in rehabilitation robots [46, 48] and hand orthoses [43]. The paradigm has also been tested with healthy subjects and stroke patients [56-61].
2.2 Imagined Body Kinematics Paradigms

2.2.1 Overview

Efforts to extract motor imagery commands from EEG signals have been progressing for years [49]. However, the time-consuming process of training and model calibration limits the efficacy of BCI utilization for many potential users. Furthermore, the first critique of controlling prosthetics for amputees via SMR is the lack of natural and intuitive control [62]. In other words,
SMR lacks the ability to directly extract kinematic parameters. Although the technique can distinguish motor activities corresponding to large body parts, the decoded motor information does not contain the magnitude or direction of kinematic parameters (e.g., position, velocity, or acceleration).

Imagined body kinematics (IBK) is a motor imagery paradigm that originated from invasive BCI technology [9, 10]. However, noninvasive work has noted that the information for this paradigm is extracted from low-frequency SMR signals (less than 2 Hz) [34]. IBK is classified as a paradigm independent of SMR because its training protocols and analysis methods are fundamentally different from those of SMR paradigms. In IBK, the subject is asked to imagine the continuous movement of only one body part in multi-dimensional space. The recorded signals are then decoded in the time domain. This paradigm is sometimes referred to as natural imaginary movement. In noninvasive devices, Bradberry et al. [63] investigated 2D cursor control with a natural imaginary movement paradigm and analyzed the data in the time domain at frequencies of less than 1 Hz. Their subjects were instructed to use natural imaginary movement of the right-hand index finger, thereby reducing training time to a level similar to invasive devices [10, 16].

In addition to Bradberry et al.'s work in noninvasive EEG technology, Ofner et al. [64] studied the continuous and natural imaginary movements of the right hand in a 2D plane. They estimated the imagined continuous velocities from EEG signals. Kim et al. [65] decoded the three-dimensional trajectory of imagined right-hand movement in space and also examined the effects of eye movements on linear and nonlinear decoding models. Andres et al. [66] conducted a similar study in 2D space using linear models. Gu et al. [67] decoded two types of imaginary movements of the right wrist at two different speeds and later [68] considered the imagined speed of wrist movements in paralyzed ALS patients. Others have studied the imaginary movement of the shoulder, elbow, wrist, and finger [69-71]. Although most of this recent work could be classified as decoding of IBK, its application to BCI is limited and still under investigation.

2.2.2 Analysis & Classification Methods

A number of seminal works have suggested that the low-frequency components of EEG signals (< 2 Hz) located over the motor cortex carry kinematic information [63-65, 67-69, 72, 73]. Although many studies have shown that kinematic data is present in low frequencies, Gu et al. [67] were the first to use this information for classification. They decoded wrist rotation and extension at fast and slow speeds. They found that discrete imagined movement is encoded in the movement-related cortical potential (MRCP). In their study, EEG signals were low-pass filtered at 2 Hz, and the negative slope during the 2 s before movement onset, known as the Bereitschaftspotential (BP), was examined. The BP has two parts, the NS1 (Negative Slope of early BP) and the NS2 (steeper Negative Slope of late BP). The NS1, NS2, and the mu (8-12 Hz) and beta (18-26 Hz) rhythms constituted the feature space in their study. In another study, Yuan et al. [74] decoded seven
different hand-clenching speeds using spatial-temporal characteristics of the alpha (8-12 Hz) and beta (18-26 Hz) bands. To translate multiple discrete speeds of hand imagery, they developed multiple linear regression models and smoothed the output with a 1 Hz low-pass filter. Although they found a correlation between higher frequency bands and the speed of imagery, they did not successfully find movement trajectory information. Bradberry et al. conducted a prominent study on IBK; they were able to extract two-dimensional hand imagery [63] and actual three-dimensional hand movement trajectories [72] using low-frequency EEG signals (< 1 Hz). A linear decoding model with first-order temporal differences of EEG data was developed, and they successfully modeled continuous cursor velocity, which was correlated with the defined trajectory. They also showed that EEG data from 100 ms before movement imagination onset is correlated with the movement. The linear model was as follows:

x[t] - x[t-1] = a_x + \sum_{n=1}^{N} \sum_{k=0}^{L} b_{nk}^{x} S_n[t-k]

The same equation was used for the horizontal and vertical velocities. In this equation, x[t] - x[t-1] is the cursor velocity along one axis, N is the number of EEG channels, L is the number of time lags, S_n[t-k] is the temporally lagged EEG at channel n at time lag k, and a_x and b_{nk}^{x} are the weights that result from the linear regression.
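Fitting the weights a and b of such a model is an ordinary least-squares problem over temporally lagged EEG samples. A minimal sketch on synthetic signals (the channel count, lag window, and noise level are illustrative, not those of [63]):

```python
import numpy as np

def build_design(S, L):
    """Design matrix of lagged EEG: each row is [1, S_n[t-k] for all n, k].
    S: (n_channels, T) low-pass-filtered EEG; rows cover t = L..T-1."""
    N, T = S.shape
    rows = [np.concatenate([[1.0], S[:, t - L:t + 1][:, ::-1].ravel()])
            for t in range(L, T)]
    return np.asarray(rows)

# Synthetic ground truth: velocity is a fixed linear mix of lagged "EEG".
rng = np.random.default_rng(3)
N, T, L = 4, 2000, 10
S = rng.standard_normal((N, T))
true_w = rng.standard_normal(N * (L + 1) + 1)
X = build_design(S, L)
v = X @ true_w + 0.01 * rng.standard_normal(X.shape[0])  # x[t] - x[t-1]

w_hat, *_ = np.linalg.lstsq(X, v, rcond=None)
print("max weight error:", np.abs(w_hat - true_w).max())
```

With real EEG the regressors are strongly correlated, which is what motivates the regularized alternatives (PLS, kernel ridge) discussed next.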
Using partial least squares (PLS) regression, Ofner and Müller-Putz [73] were able to reduce EEG artifacts and also eliminate highly correlated variables. They were also able to identify relationships between latent predictors and desired response variables. By using different electrode locations and different time lags as latent variables, the algorithm captured the user's source-space contribution to the latent variables. Finally, Kim et al. [65] explored a nonlinear decoding model called kernel ridge regression (KRR). They showed that the KRR algorithm significantly reduced eye movement contamination, which is common in linear models. Andres et al. [66] and Kim et al. [65] examined the role of eye movement in the linear decoding of IBK. By comparing the decoding performance with and without EOG-contaminated brain signals, they found that eye movement plays a significant role in IBK tasks. Additionally, in contrast to a report published by Korik et al. [75], Kim et al. [65] confirmed that the SMR bands do not contain kinematic parameter information.
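Kernel ridge regression has a simple closed form; a minimal sketch with an RBF kernel on synthetic 1-D data (all parameters are illustrative, not those used by Kim et al.):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=1e-3, gamma=1.0):
    """Closed-form kernel ridge: alpha = (K + lam*I)^-1 y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Fit a nonlinear target that a purely linear decoder would miss.
rng = np.random.default_rng(4)
X = rng.uniform(-2, 2, size=(200, 1))
y = np.sin(2 * X[:, 0]) + 0.05 * rng.standard_normal(200)
alpha = krr_fit(X, y)
y_hat = krr_predict(X, alpha, X)
print("train RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```

The ridge term lam controls how aggressively the fit is smoothed, which is one mechanism by which such models can be less sensitive to artifact-driven outliers than unregularized linear regression.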
2.2.3 Applications and Targeted Patient Population

The IBK paradigm is new to noninvasive devices. Thus far, it has been applied in a limited number of applications, likely because of the poor decoding of EEG signals [76]. Abiri et al. employed natural imagery movements of one hand to control different gestures of a social robot [77, 78] and to manipulate a robotic arm [79]. Gu et al. [68] employed the imagined speed of wrist movements in paralyzed ALS patients. It was shown that employing natural IBK paradigms can dramatically reduce training times. A generic model which can be
operated with zero training is also a promising future development. Abiri et al. [80, 81] used IBK in a zero-training BCI paradigm to control a quadcopter in 2D space.

3 External Stimulation Paradigms

Brain activity can be affected by external stimulation such as flickering LEDs and sounds. The altered EEG activity can be collected and decoded to control real or virtual objects or external prosthetics. This is the basic principle of external stimulation paradigms. External stimulation can be visual [82, 83], auditory [83, 84], or somatosensory [85]. The following sections discuss the most common external stimulation paradigms employed by BCI researchers.

3.1 Visual P300 Paradigms

3.1.1 Overview

One of the most popular paradigms in EEG-based BCI systems is the visual P300 (for a review see [86, 87]). Farwell and Donchin pioneered the use of the visual P300 BCI in 1988 [88] by creating what is now referred to as the P300 speller. The P300 is one of the most studied event-related potentials (ERPs). An ERP is derived by averaging EEG signals of a specific event type. The P300 component is elicited in response to infrequently presented events using what is known as an "oddball paradigm." The P300 is a positive peak in the ERP, ranging from 5 to 10 microvolts in size, with a latency between 220 and 500 ms after the event onset (see Figure 2). This ERP appears as an averaged increase in the amplitude of the brain-signal time series that is most prominent at midline locations (Pz, Cz, and Fz in the 10/20 international system). When inter-stimulus intervals are less than 250-300 ms [89], the definition of the P300 becomes debatable because the P300 response and the presentation of subsequent stimuli overlap in time. For example, with very short inter-stimulus intervals, such as 125 ms, 3 to 5 stimuli are delivered in the range 0-500 ms from the onset of the first stimulus. The P300 elicited in this paradigm is likely the sum of the P300 and other components elicited by the stimuli presented before and after any given stimulus.

The most important advantages of the visual P300 BCI are that most subjects can use it with very high accuracy and that it can be calibrated in minutes. Therefore, the user can easily and quickly use the system to control devices. Disadvantages of this paradigm include fatigue from the high level of attention and visual focus required to use the system [90], and the inability of people with visual impairments to use the system [91].
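The trial-averaging step that underlies ERP extraction can be sketched as follows; the synthetic P300 below (a positive deflection about 300 ms after each rare stimulus) only mimics the latency range quoted above and is not recorded data:

```python
import numpy as np

FS = 256  # sampling rate in Hz (assumed)

def average_erp(eeg, onsets, fs=FS, tmax=0.6):
    """Average fixed-length epochs aligned to stimulus onsets (in samples)."""
    n = int(tmax * fs)
    epochs = np.stack([eeg[o:o + n] for o in onsets if o + n <= len(eeg)])
    return epochs.mean(axis=0)

# Synthetic "Pz" signal: noise plus a positive deflection ~300 ms after each
# rare (target) stimulus; frequent stimuli add nothing.
rng = np.random.default_rng(5)
eeg = rng.standard_normal(FS * 60)  # one minute of simulated EEG
targets = np.arange(FS, len(eeg) - FS, FS * 2)  # rare events every 2 s
p300 = np.exp(-0.5 * ((np.arange(int(0.6 * FS)) / FS - 0.3) / 0.05) ** 2)
for o in targets:
    eeg[o:o + len(p300)] += 2.0 * p300

erp = average_erp(eeg, targets)
peak_ms = 1000 * np.argmax(erp) / FS
print(f"peak latency ~ {peak_ms:.0f} ms")  # near 300 ms
```

Averaging suppresses the background EEG by roughly the square root of the number of trials, which is why accuracy improves with more trials at the cost of speed, as discussed below.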
Figure 2. ERP components after the onset of a visual stimulus (figure copied from [92] with permission).

3.1.2 Analysis & Classification Methods

A summary of previous studies using P300 is shown in Table 2. The P300 was initially reported by Sutton et al. [93] in 1967. The P300 speller was initially introduced by Farwell and Donchin [88] within a row/column paradigm (RCP) comprised of a 6×6 matrix of letters and numbers. Since collecting a subject's overt behavioral response is not necessary for this paradigm, it can be used as a motor-free means of communication for severely disabled patients. Additionally, the P300 shares very similar inter-subject characteristics, which helps to diminish subjects' training time [94]. However, the subject is required to maintain attention throughout the experiment. The P300 amplitude is sensitive to a number of factors, such as the probability of target appearance, the inter-trial duration, the difficulty of the experiment, the attentional state of the participant, and habituation effects [92]. Faster P300 responses are indicative of better cognitive performance in attentional and immediate memory tasks [92]. Latency jitter can make it difficult to extract the P300 deflection; thus, presenting multiple trials and averaging the EEG response is required to increase the signal-to-noise ratio and, thereby, improve decoding accuracy. However, when more trials are presented the rate of communication is slower, which leads to a speed/accuracy trade-off.
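This speed/accuracy trade-off is commonly summarized by the information transfer rate (ITR) that appears repeatedly below. A sketch of the standard Wolpaw ITR formula (the example numbers are hypothetical, not results from the cited studies):

```python
import math

def itr_bits_per_min(n_choices, accuracy, selections_per_min):
    """Wolpaw information transfer rate.
    Bits per selection: log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))."""
    n, p = n_choices, accuracy
    if p >= 1.0:
        bits = math.log2(n)
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * selections_per_min

# e.g., a hypothetical 4-choice task at 95% accuracy, 16 selections/minute:
print(f"{itr_bits_per_min(4, 0.95, 16):.1f} bits/min")
```

Averaging more trials raises the accuracy P but lowers the selections per minute, and the ITR captures the net effect of that trade.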
In [88] and [94], the authors addressed the relationship between the number of trials and decoding accuracy using stepwise discriminant analysis (SWDA) and reported that more trials significantly improved performance. Piccione et al. [95] extracted the P300 by using a fuzzy method to combine decomposed ICA components of the EEG. Krusienski et al. [96] compared various classification techniques, including Pearson's correlation method (PCM), Fisher linear discriminant (FLD), stepwise linear discriminant analysis (SWLDA), and linear and nonlinear support vector machines (SVMs). They illustrated that FLD and SWLDA performed
Page 11 of 43 AUTHOR SUBMITTED MANUSCRIPT - JNE-102497.R2

1
2
3 significantly better than other classification methods. Moreover, their analysis indicated that the
4

pt
5 P300 was stable across sessions and subjects.
6
7
Citi et al. [97] introduced a 2D cursor control P300-based BCI; they were able to extract an analog control signal with a single-trial approach using a genetic algorithm. Other single-trial classification approaches using P300 signals have also been reported [98-101]. Most of the early P300 speller research had focused on EEG locations along the midline (e.g., Fz, Cz, and Pz). In [102], information from posterior locations such as PO7, PO8, and Oz was added to an SWLDA classifier, and the authors showed that adding these electrode locations significantly improved the discriminability of the data samples. Bell et al. [103] increased the information transfer rate (ITR) to 24 bits/min for a four-choice classification problem, relying on the fact that the P300 has a robust response over multiple trials; analyzing only five trials of P300 responses, they elicited P300-based control with 95% accuracy using SVMs. Edlinger et al. [104] and Chen et al. [105] applied the paradigm in a virtual environment (VE), as an actuator in a smart building scenario and to control a virtual hand, respectively. By dividing the screen into seven different regions, Fazel-Rezai and Abhari [106] were able to reduce the distraction caused by adjacent items and, at the same time, to lower the stimulus probability. These changes produced larger P300 amplitudes, which resulted in higher detection accuracy and higher ITR [92].
An innovative checkerboard paradigm (CBP) was introduced in [107]. The CBP showed significantly higher mean accuracy than the row-column paradigm (RCP) (92% compared to 77%), and the mean ITR increased from 17 bits/min to 23 bits/min. The CBP avoids the stimulus adjacency-distraction error addressed in [106] and also increases P300 detection accuracy by lowering the probability of target occurrence. In [108], a language model was utilized to enhance typing speed. The authors examined P300 BCI paradigms including single-character presentation (SCP) and RCP, and they also tested a rapid serial visual presentation (RSVP) paradigm. They applied PCA over [1.5-42] Hz band-pass filtered EEG to extract a one-dimensional feature vector from multiple locations over the frontal, central, and parietal regions.
Table 2. A summary of studies with the P300 paradigm.

Reference | Task | Feature | Classification method
[88] | 6×6 row/column (RC) speller | Data from the Pz channel, band-pass filtered [0.02, 35] Hz and downsampled to 50 Hz | SWLDA
[95] | Control of a virtual ball in 2D | Data from Cz, Pz, Oz, and Fz channels; ICA applied to extract features | Three-layer ANN
[96, 102] | 6×6 row/column (RC) speller | Moving average and decimation with a factor of 12 | SWLDA
[97] | Computer cursor control in 2D | Low-pass filter with 34 Hz cut-off frequency and decimation to 128 Hz | Continuous wavelet transform (CWT) and genetic algorithm (GA)
[103] | Control of a humanoid robot | Band-pass filter [0.5, 30] Hz and downsampling to 100 Hz | SVM
[104] | Single-character (SC) speller | Band-pass filter [0.5, 30] Hz and downsampling to 60 Hz | LDA
[106] | Region-based (RB) speller | C1, C2, Cz, Pz, and Fz channels | Averaged Mexican-hat wavelet coefficients used as feature set
[107] | 8×9 checkerboard (CB) speller | Cz, Pz, PO7, and PO8 channels | SWLDA
[2, 109] | Single-character (SC) speller | Scaling data samples into [-1, 1] and downsampling to 32 Hz | Bayesian linear discriminant analysis (BLDA) and Fisher's linear discriminant analysis (FLDA)
[110] | Target selection in 3D space | Channel selection and downsampling to 16 Hz | SWLDA
3.1.3 Applications of Visual P300 and Targeted Patient Population
The most common application of visual P300 has been in developing prosthetic keyboards to provide a pathway of communication for disabled patients. Usually, speller devices in BCI consist of a matrix of letters, numbers, and symbols [94]. The rows and columns of this matrix are flashed in sequence, and the subject has to focus attention on the intended character. The intended character is then determined by the speller based on its row and column. These devices use a statistical model based on the P300 ERP to identify the correct symbol during flashing. The main advantage of P300 spellers has been their usefulness to people with ALS [92, 111, 112] and brainstem stroke [113]. P300 has also been investigated as a way for a subject to control specific tasks in the environment [114]. It has been used to control a humanoid robot [103] and to navigate a wheelchair [110, 115]. The paradigm was also employed by paralyzed patients [95] to control a computer cursor in 2D space [97]. Additionally, it has been used to control a virtual hand [105] in a virtual reality smart apartment [104].
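The row/column decision rule described above can be sketched in a few lines. This is a minimal illustration, not any particular speller's implementation: the 6×6 grid layout and the per-flash evidence values are hypothetical, standing in for accumulated P300-classifier scores.

```python
import numpy as np

# Hypothetical 6x6 speller grid.
grid = np.array([list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
                 list("STUVWX"), list("YZ1234"), list("56789_")])

def decode(row_scores, col_scores):
    """Pick the cell at the intersection of the best row and best column.

    row_scores/col_scores hold the P300-classifier evidence for each of
    the 6 row flashes and 6 column flashes, summed over repetitions."""
    r = int(np.argmax(row_scores))
    c = int(np.argmax(col_scores))
    return grid[r, c]

# Example: evidence peaks for the third row and fourth column.
row_scores = np.array([0.1, 0.3, 2.2, 0.2, 0.4, 0.1])
col_scores = np.array([0.2, 0.1, 0.3, 1.9, 0.5, 0.2])
print(decode(row_scores, col_scores))  # -> P
```

In a real speller the scores come from a discriminant such as SWLDA applied to each post-flash epoch, and the intended character is the intersection of the winning row and column.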


3.2 Steady State Visual Evoked Potential Paradigms

3.2.1 Overview
The steady-state visual evoked potential (SSVEP) is another popular visual component used in BCI [116, 117]. SSVEP is also called photic driving, since the generators of this response are located in the visual cortex. Rather than executing or imagining a motor action, subjects have to shift their gaze, as well as their attention, to flickering stimuli, which requires highly accurate eye control.
In the SSVEP paradigm, a constant-frequency flickering stimulus on the central retina results in an EEG pattern consistent with the flickering rate. The stimulation frequencies can vary from low (1~3.5 Hz) to high (75~100 Hz) [118]. The stimulus can be produced using a light-emitting diode (LED) or a cathode ray tube (CRT). Multiple flickering targets with distinct flickering frequencies are typically presented to the user. There is a strong correlation between the flicker frequency and the observed frequency of the EEG. The user's intended target is determined by matching the pattern of EEG activity to the command associated with the particular frequency.
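The frequency-matching step can be sketched with a simple spectral comparison. The sampling rate, window length, candidate frequencies, and synthetic occipital signal below are assumptions chosen for illustration, not values from any cited system.

```python
import numpy as np

fs = 250                                  # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)             # 2 s analysis window
stim_freqs = [8.0, 10.0, 12.0, 15.0]      # candidate flicker frequencies

def detect_target(eeg):
    """Return the stimulus frequency with the largest spectral power."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in stim_freqs]
    return stim_freqs[int(np.argmax(powers))]

# Synthetic single-channel signal: the user attends the 12 Hz target.
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 12.0 * t) + 0.5 * rng.normal(size=t.size)
print(detect_target(eeg))  # -> 12.0
```

Practical systems usually also pool power at harmonics of each candidate frequency, or use canonical correlation analysis, to make this comparison more robust.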

There are advantages associated with the SSVEP paradigm. Because the stimuli are exogenous, it is a no-training paradigm that can be used by many subjects. The stimuli flash at many different frequencies, thereby providing many commands and more degrees of freedom to control prosthetic devices. In addition, SSVEP frequencies can be classified more reliably than event-related potentials. However, the use of flickering stimuli can lead to fatigue for the subject, mainly when low flickering frequencies are used [119-122]. This paradigm is also not well suited for people with visual impairments, due to the gaze shifts required during use. However, Min et al. [123] have recently proposed a new SSVEP paradigm that uses a grid-shaped line array; they suggested that this novel presentation is gaze-independent. There are also steady-state somatosensory evoked potentials (SSSEP) [124] and hybrid SSSEP and P300 applications [125].
3.2.2 SSVEP Analysis & Classification Methods


As opposed to the transient VEP, which is used to measure the travel time of a visual stimulus from the eye to the occipital cortex [117], the SSVEP depicts a stable characteristic of the spectral content of EEG signals. Among the various EEG paradigms, SSVEP is less vulnerable to artifacts and has a higher ITR: while BCIs based on P300 or SMR paradigms reach ITRs of 4-60 bits/min, SSVEP-based BCIs yield ITRs of 100 bits/min and higher. Since the information in SSVEP paradigms is located in narrow frequency bands, a narrow-band band-pass filter is typically part of the SSVEP signal preprocessing. However, the amplitude and phase characteristics of the SSVEP depend on the intensity and frequency of the stimulus.
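ITR figures such as those quoted above are commonly computed with the Wolpaw definition; the sketch below assumes that definition, since the text does not specify which one the cited numbers use.

```python
import math

def itr_bits_per_min(n_classes, accuracy, trial_seconds):
    """Wolpaw information transfer rate in bits/min.

    bits/selection = log2 N + P log2 P + (1 - P) log2((1 - P) / (N - 1)),
    assuming 0 < accuracy <= 1 and N equiprobable classes."""
    n, p = n_classes, accuracy
    bits = math.log2(n)
    if p < 1.0:
        bits += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n - 1))
    return bits * (60.0 / trial_seconds)

# A perfect 4-class selection every 2 s transfers 2 bits per selection.
print(itr_bits_per_min(4, 1.0, 2.0))  # -> 60.0
```

The formula makes the speed/accuracy coupling explicit: at chance-level accuracy the rate drops to zero regardless of how fast selections are made.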
Herrmann [118] investigated the correlation between the frequency of stimulus presentation and the firing rates of neurons. The results exhibited resonance phenomena at 40 Hz, subharmonics at 10 Hz and 20 Hz, and weaker responses at integer multiples of the stimulus frequency (e.g., 80 Hz). Muller-Putz and Pfurtscheller [126] applied SSVEP to a hand prosthesis using four-class classification with LED flicker at 6, 7, 8, and 13 Hz. Harmonic sums at each of the stimulation frequencies yielded the feature set for classification of the SSVEP, and they achieved online accuracy between 44% and 88%. A drawback of the SSVEP paradigm is that low-frequency stimulation can lead to fatigue or epileptic seizure; therefore, a high-frequency flicker (60-100 Hz) is preferred [127]. Bryan et al. [128] used the signal's power spectrum, estimated with the Fast Fourier Transform (FFT), as an input to control a humanoid robot with a single electrode (Oz). Li and Zhang [129] applied an LDA classifier and an optimization algorithm to improve SSVEP online accuracy. A minimum energy combination (MEC) was utilized in [130] to detect principal and harmonic frequencies in spatially filtered signals. They also conducted an extensive study including 61 subjects in order to investigate the scope of applicability of SSVEP-based BCIs; in addition to performance, they examined a number of covariates including age, gender, and level of tiredness. Chen et al. [131] examined the correlation coefficients between the stimulus frequency and the subject's EEG frequency using canonical correlation analysis (CCA). Considering accuracy and ITR simultaneously, they determined a user-specific optimal stimulation duration and phase interval. In a text-input application, Chen et al. [132] attempted to enhance ITR by employing entropy coding algorithms such as Huffman coding. An advantage of the SSVEP paradigm is that it is less susceptible to motion artifacts; thus, it is a suitable choice for a mobile subject. Pfurtscheller et al. [133] showed that a gait-assistance exoskeleton could be accurately controlled, and they evaluated the online and offline performance of CCA and k-nearest-neighbors (kNN) classifiers.
Most studies conducted with the SSVEP paradigm are based on decoding bottom-up visual information; thus, these systems are gaze-shift dependent. Min et al. [123] examined a top-down visual condition within the paradigm. The results in the top-down condition showed a different pattern over the occipital lobe than the pattern produced by the bottom-up condition. Moreover, a randomly-shuffled LDA (rLDA) classifier performed more accurately in the top-down condition than the more commonly used CCA classifier. An overview of previous SSVEP studies, with accuracy and ITR, is shown in Table 3.
Bio-inspired intelligent information processing techniques can also help researchers to understand human perceptual systems and to incorporate the biological models and features of those systems into techniques for processing physiological signals in BCI. For instance, entropy can be used to measure the dynamic complexity of EEG signals: Cao et al. [134] proposed using inherent fuzzy entropy to improve EEG complexity evaluation, which can be applied to SSVEP.
Table 3. An overview of SSVEP paradigms.

Reference | Task | Feature | Classification method
[132] | Spelling using a multi-level selection criterion | (6-10) Hz over Oz and A1, grounded by A2 | Bayesian model enhanced by language entropy model
[133] | Lower limb exoskeleton | (9-17) Hz with eight electrodes over occipital and parietal lobes, referenced by FCz and grounded by Fpz | CCA and kNN
[135] | Checkerboard as visual stimuli | (6-10) Hz over Oz, O1, and O2 | Maximum likelihood
[123] | Spelling using a grid-shaped flicking structure | (5-10) Hz over 3 electrodes of the occipital lobe | CCA and rLDA
[136] | Navigation in a 2-dimensional computer game | 15, 30, 45 Hz and their 90° phase shifts over occipital and parietal lobes | CCA
[131] | Spelling characters | (7-70) Hz with 9 electrodes over parietal and occipital lobes | CCA
[126] | Control of an electrical prosthesis | (6-13) Hz with four electrodes over occipital lobe | Maximum likelihood
[129] | A brain-to-brain motion control system | (6-13) Hz with four electrodes over occipital lobe | LDA
[130] | Spelling | (7-10) Hz with eight electrodes over occipital lobe | MEC
3.2.3 SSVEP Applications and Targeted Patient Population
Due to the large number of discrete control commands and the high reliability of SSVEP, the paradigm has been studied by many BCI researchers. Recently, a high-speed SSVEP speller was used to enable the subject to choose among 40 characters, including letters of the English alphabet, numbers, and some symbols [131]. In addition, a user-dependent SSVEP speller, based on determining the prominent key parameter for each user, was developed in [130] to spell a single phrase. A multilevel SSVEP-based BCI that exploits the appearance frequency of letters was designed in [132] to type text. Bryan et al. [128] used SSVEP signals to control a humanoid robot. Other applications include an electrical prosthesis [126], an orthosis [137], and a lower limb exoskeleton [133]. Recently, [135] demonstrated the feasibility of an SSVEP paradigm in locked-in syndrome. SSVEPs have even been used to allow a cockroach to navigate a desired path [129] and to navigate in a 2-dimensional BCI game [136].
in syndrome. SSVEPs have even been used to allow a cockroach to navigate the desired path
[129] and to navigate in a 2-dimensional BCI game [136].
dM
29
4 Error-Related Potential

4.1 Overview
The error-related potential (ErrP) has recently been used as an ERP component that can correct BCI errors [138]. The ErrP occurs when there is a mismatch between a subject's intention to perform a given task and the response provided by the BCI. For example, assume a user wishes to move a cursor from the middle of a monitor to the left side of the monitor. If the cursor erroneously moves to the right, an error-related potential will be generated. The ErrP is most pronounced over the frontal and central lobes and has a latency of 200-700 ms. Figure 3 shows a schematic of how an ErrP is generated and how it can be used to teach an intelligent agent to control a BCI. The paradigm no longer relies on averaging a number of trials, as in P300; instead, it uses a short window on a single-trial basis. Ferrez and Millan [139] decoded errors that followed the misrecognition of user intent by the BCI system. Subsequently, Chavarriaga and Millan [140] utilized the ErrP to control an external autonomous device within the concept of shared autonomy, which describes the situation where the user has only supervisory control over the actions of a system upon which he/she has no direct control. Consistent with previous reports, they observed an ERP response located over the medial-frontal cortex with a negative amplitude around 260 ms after an error was detected by the subject. Moreover, the amplitude of the ERP is inversely modulated by the frequency of the autonomous system's errors [140].

Figure 3. A schematic of how an ErrP paradigm can be used in a BCI system (figure copied from [138] with permission). Left: detecting the existence of an error and correcting the last movement. Right: using the ErrP in a learning process to update a BCI classifier.
A real-time, closed-loop BCI system can be regarded as a control problem, and the ErrP can be used to adjust the input control signals to the device. While in a traditional control system the adjustment is performed by linear or nonlinear controllers, in a BCI system, where the brain plays the role of the controller, the adjustment can be performed automatically by the brain signals themselves (for more information see the review [141]). Finding a suitable controller in a traditional control system has become a solvable problem; however, understanding brain-controlled signals and translating them into logical and stable commands for use in an external device remains challenging. This problem is further discussed in a study by Artusi et al. [142].
46
ce

47 The process of using ErrP in a closed-loop BCI system could be considered as analogous to
48 “learn from your mistake.” In contrast to a traditional control system, in which error signal can
49
50 be sensed in milliseconds, the brain does not produce an ErrP until 200ms-700ms after the
51 subject receives feedback [139, 142]. The feedback is the relevant event whose onset engages the
52 brain circuits to process error-related information. The delay and non-stationarity of the signal
Ac

53
54 slows the system and makes real-time implementation difficult. Additionally, since the ErrP does
55 not contain any information about direction or magnitude, there is still the challenge of how to
56
57
58 16
59
60
Page 17 of 43 AUTHOR SUBMITTED MANUSCRIPT - JNE-102497.R2

1
2
3 adjust command signals based on detected ErrP in a multi-degree-of-freedom control system.
4

pt
5 Thus, most BCI systems are designed using pre-learned algorithms to perform a task in a closed-
6 loop BCI [140, 143]. Recently, Iturrate et al. [144] developed a BCI system using the ErrP to
7 autonomously complete a task after a training time of several minutes. In their task, a brain-
8
9 controlled robotic arm learned how to reach a specific target based on a pre-learned algorithm

cri
10 using ErrP paradigm.
4.2 ErrP Analysis & Classification Methods
One approach to extracting the ErrP is to detect the discrepancy between the observed action and the translated action in the BCI platform. Ferrez and Millan [139] found an interaction between the subject and the BCI system: they observed positive peaks at 200 ms and 450 ms after feedback and negative peaks at 250 ms and 450 ms after feedback. They also observed that the ErrP amplitude is higher as the error rate decreases. Chavarriaga and Millan [140] investigated the consequences of the subject monitoring an external agent over which the subject has no control. They used a cursor movement paradigm and estimated the posterior probability P_err of moving the cursor in the wrong direction by classifying the EEG signal with a Gaussian classifier. They found that electrode locations FCz and Cz were most closely correlated to the ErrP response.
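The posterior-probability idea behind a Gaussian classifier can be shown in one dimension. All parameters below (class means, standard deviations, prior error rate) are invented for illustration; they are not the fitted values from [140].

```python
import numpy as np

# One feedback-locked EEG feature x (e.g., amplitude at FCz around 260 ms),
# modeled with one Gaussian per class under assumed parameters.
mu_ok, sd_ok = 0.0, 1.0       # "correct" trials
mu_err, sd_err = -2.0, 1.0    # "error" trials: negative deflection
p_prior_err = 0.2             # the autonomous system's error rate

def gauss(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def p_err(x):
    """Posterior probability that the observed trial was an error."""
    num = p_prior_err * gauss(x, mu_err, sd_err)
    den = num + (1 - p_prior_err) * gauss(x, mu_ok, sd_ok)
    return num / den

print(p_err(-2.5), p_err(0.5))
```

A strongly negative feature yields a high posterior error probability, while a near-zero feature is attributed to the "correct" class; thresholding this posterior gives the single-trial error decision.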
Iturrate et al. [145] designed a study in which a subject observed a virtual robot performing a reaching task. The subject was instructed to judge the robot's motion based on prior information about the correct path. The averaged EEG waveforms at each electrode location were calculated, and the results showed a significant difference between correct and incorrect operation of the robot. On error trials, a sharp positive peak at approximately 300 ms was observed, followed by a negative peak at approximately 400 ms. The feature vectors were derived in two steps: first, bipolar channels in the medial and posterior regions within the range [150-700 ms] were selected, offset components were removed, a band-pass filter of 0.5-10 Hz was applied, and the result was down-sampled to 64 Hz; second, they applied a functional decision tree in their AdaBoost classification algorithm to the resulting feature vector. Ten-fold cross-validation suggested that the resulting averaged EEG waveforms distinguished between correct and incorrect motion of the robot.
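The preprocessing steps just listed (window selection, offset removal, 0.5-10 Hz band-pass, down-sampling to 64 Hz) can be sketched as a short pipeline. The acquisition rate, filter order, and channel count are assumptions; only the step order and cut-offs follow the description above.

```python
import numpy as np
from scipy.signal import butter, filtfilt, decimate

fs = 256                                   # acquisition rate in Hz (assumed)
b, a = butter(4, [0.5, 10], btype="bandpass", fs=fs)

def errp_features(epoch, onset_s=0.15, offset_s=0.70, target_fs=64):
    """epoch: samples x channels, time-locked to feedback.

    Band-pass 0.5-10 Hz, keep the 150-700 ms window, remove the
    DC offset, and down-sample to 64 Hz; flatten into one vector."""
    filtered = filtfilt(b, a, epoch, axis=0)
    window = filtered[int(onset_s * fs):int(offset_s * fs)]
    window = window - window.mean(axis=0)
    reduced = decimate(window, fs // target_fs, axis=0)
    return reduced.ravel()

rng = np.random.default_rng(3)
epoch = rng.normal(size=(fs, 2))           # 1 s of data, 2 bipolar channels
print(errp_features(epoch).shape)
```

The resulting vector would then be fed to the classifier (AdaBoost with decision trees in [145], though any of the discriminants discussed earlier could be substituted).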
4.3 ErrP Applications and Targeted Patients
The use of the ErrP in BCI systems was initially investigated by Ferrez and Millan [139]. Chavarriaga and Millan [140] employed the ErrP to allow a user to correct the behavior of an external autonomous system: the user watched the autonomous system and maintained supervisory control in order to correct its behavior without any direct or continuous control.
The ErrP has been employed for robot reinforcement learning [145], 1D cursor control [139, 140, 146-150], and 2D cursor control [144, 151]. Iturrate et al. [143] used the ErrP with shared control for a 2D reaching task. The ErrP has also been used in BMI systems to control artificial [152] and robotic arms [149, 153], and it has been used to teach a robotic BMI system how to reach a particular target in a 2D plane [144].
The ErrP can provide additional information to improve closed-loop BCI systems. It is likely that, in the future, the ErrP will allow a user to observe a BCI system and spontaneously make the desired change without the need to directly perform a control task [154, 155].

5 Hybrid Paradigms

5.1 Overview
A hybrid paradigm refers to a combination of two or more physiological measures, of which at least one is EEG (for reviews see [156-158]). The other physiological measures can be other bio-signals such as heart rate (ECG), eye movements (EOG), or the hemodynamic signal recorded by fNIRS [159]. In hybrid paradigms, sequential or simultaneous processing structures can be used to output control commands to the BCI system [157]; Figure 4 shows a schematic of each structure. In the simultaneous configuration, bio-signals concurrently enter two (or more) parallel decoding systems, while in a sequential setting one decoding paradigm acts as a gating function for another decoding system. Visual P300, SSVEP, and SMR paradigms are the most prevalent paradigms in the development of hybrid BCI systems [82, 116].
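The two processing structures can be expressed as a small control-flow skeleton. This is purely illustrative: the gate, decoder, and fusion functions are stand-ins, not real decoders.

```python
from typing import Callable

def sequential(gate: Callable[[object], bool],
               decoder: Callable[[object], str]) -> Callable:
    """Paradigm 1 acts as a gating function (e.g., a motor-imagery 'switch');
    paradigm 2 then produces the actual command (e.g., SSVEP or eye tracking)."""
    def control(signal_1, signal_2):
        if not gate(signal_1):
            return "idle"
        return decoder(signal_2)
    return control

def simultaneous(decoder_1, decoder_2, fuse) -> Callable:
    """Both paradigms are decoded in parallel and their outputs are fused."""
    def control(signal_1, signal_2):
        return fuse(decoder_1(signal_1), decoder_2(signal_2))
    return control

# Toy usage with stand-in decoders: the gate must fire before commands pass.
hybrid = sequential(gate=lambda s: s > 0.5, decoder=lambda s: f"move:{s}")
print(hybrid(0.2, "left"), hybrid(0.9, "left"))
```

The sequential form trades responsiveness for a lower false-positive rate during rest, which is exactly the motivation given for the ERS-gated SSVEP system of Pfurtscheller et al. [137].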
Figure 4. A schematic of the two structures employed in hybrid BCI systems: a) sequential form, b) simultaneous form.
In recent BCI studies, combining several of the aforementioned paradigms, or combining a BCI paradigm with another interface, has been shown to enhance BCI performance. For example, Luth et al. [160] paired P300 and SSVEP to control an assistive robotic arm. In a 2D cursor task, Li et al. [161] used Mu and Beta rhythms for controlling horizontal movement and the P300 for vertical movement. Bi et al. [162] also used a combination of SSVEP and P300: the SSVEP paradigm was used to extract directional information (clockwise/counterclockwise), and the P300 was used to decode the speed of the cursor. To minimize false positives during the user's resting state, Pfurtscheller et al. [137] introduced a hybrid BCI that combined event-related synchronization (ERS) and SSVEP, collected from an EEG channel located above the motor cortex and another electrode located above the visual cortex. Allison et al. [163] developed a 2D cursor control BCI incorporating SSVEP for decoding horizontal movements and event-related desynchronization (ERD) for vertical movements.
5.2 Analysis & Classification Methods
Duan et al. [164] developed a hybrid BCI platform to control a robot to execute a grasping motion using SSVEP, the Mu rhythm, and feet motor imagery. A comparison between a single-paradigm and a hybrid neurofeedback real-time BCI consisting of motor imagery and SSVEP was reported by Yu et al. [165]; they used the common spatial pattern (CSP) method to extract maximally different mu- and beta-band powers for the distinct motor imagery classes and utilized CCA to decode the flickering frequency. Kim et al. [166] combined EEG and eye tracking for controlling a quadcopter. They discriminated two mental states, intentional concentration and non-concentration, using EEG signals: they applied CSP to filter the EEG and then utilized an autoregressive (AR) model to estimate the spectral power of the EEG from 11 Hz to 19 Hz. The classification between the two states was performed by an SVM, which worked as a gating function to switch on the quadcopter; afterwards, the eye tracker was exploited to control the direction of the drone. Kim et al. [167] utilized the same BCI platform in a pointing and selection task. A summary of previous studies on hybrid BCI is shown in Table 4. Further information regarding hybrid BCIs can be found in recent review articles [116, 156-158, 168, 169].
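CSP, used in several of the hybrid systems above, finds spatial filters that maximize the variance of one class while minimizing it for the other. The sketch below uses one common formulation (a generalized eigendecomposition of trial-averaged covariances) on synthetic data; the channel counts and variance structure are assumptions for illustration.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=1):
    """Common spatial patterns from two classes of trials.

    trials_*: arrays of shape (n_trials, n_channels, n_samples).
    Returns 2*n_pairs spatial filters, one row per filter."""
    def mean_cov(trials):
        covs = [x @ x.T / np.trace(x @ x.T) for x in trials]
        return np.mean(covs, axis=0)
    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    vals, vecs = eigh(ca, ca + cb)        # generalized eigendecomposition
    order = np.argsort(vals)              # extremes discriminate best
    picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return vecs[:, picks].T

rng = np.random.default_rng(0)
# Class A: large variance on channel 0; class B: on channel 1.
a = rng.normal(size=(20, 3, 200)) * np.array([3.0, 1.0, 1.0])[None, :, None]
b = rng.normal(size=(20, 3, 200)) * np.array([1.0, 3.0, 1.0])[None, :, None]
W = csp_filters(a, b)
print(W.shape)
```

In a motor-imagery pipeline, the log-variance of each CSP-filtered trial typically serves as the feature vector for a classifier such as the SVM gate described above.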
Table 4. An overview of previously published hybrid BCI paradigms.

Reference | Task | Feature | Classification method
[160] | Control of rehabilitation robotic devices | Sequential P300 and SSVEP | Matched filter, FFT
[161] | 2D cursor control | Simultaneous mu/beta rhythm and P300 | SVM
[137] | Control of an orthosis | Sequential ERS and SSVEP | FLDA
[170] | Control of an artificial upper limb with 2 degrees of freedom (DoF) | Simultaneous motor imagery and SSVEP | CCA, FLDA
[163] | 2D cursor control | Simultaneous ERD and SSVEP | LDA
[166] | Quadcopter flight control | Sequential motor imagery and eye tracking | SVM
[162] | 2D cursor control | Simultaneous SSVEP and P300 | RBF SVM, FLDA
[164] | Robotic grasp control | Sequential SSVEP and Mu rhythm | CCA, STFT
[171] | Robotic control | Sequential EOG and ERP | LDA
[172] | Quadcopter flight control | Simultaneous EEG and NIRS | LDA
[165] | Neurofeedback training | Simultaneous motor imagery and SSVEP | CCA, CSP+FFT
5.3 Applications and Targeted Patient Population
Hybrid paradigms have been developed and applied in many BCI applications. Some studies have used a combination of two EEG signals to control virtual objects and prosthetic devices. For example, Bi et al. [162] used the P300 and SSVEP paradigms to control a 2D computer cursor. Allison et al. [163] used the SMR and SSVEP paradigms to control a computer cursor in 2D space. Li et al. [161] used the SMR and P300 paradigms to control a 2D computer cursor. Horki et al. [170] combined SMR and SSVEP to control a 2-DoF artificial upper limb. Also using SMR and SSVEP, Duan et al. [164] controlled a humanoid robot to perform simple tasks. Pfurtscheller et al. [137] evaluated the feasibility of orthosis control using a combination of SSVEP and motor imagery paradigms. Yu et al. [165] also used a combination of SSVEP and motor imagery to enhance training performance for a motor imagery paradigm. Luth et al. [160] employed a hybrid of P300 and SSVEP for the low-level control of a semi-autonomous robotic rehabilitation system. In other hybrid BCI systems, EEG is combined with other bio-signals such as EOG. For example, Kim et al. [167] and Malechka et al. [173] developed wearable hybrid BCI systems using EEG and an eye-tracking device, and Kim et al. [166] employed their system with a motor imagery paradigm to control a quadcopter in three-dimensional space. Ma et al. [171] developed a novel hybrid BCI using eye movements and the P300 ERP to control devices such as mobile robots and humanoid robots. Other studies have combined EEG paradigms with other neuroimaging techniques (e.g., fNIRS) for communication and vigilance-state monitoring in patients with ALS [159], and to control external devices such as quadcopters [172].
6 Other Paradigms
In addition to the most common BCI paradigms detailed above, other paradigms have been examined in a limited number of studies. Table 5 lists a number of previously developed EEG-based BCI paradigms with a brief description of each system. Among the paradigms shown in Table 5, the "covert and overt attention," "discrete movement intention," and "auditory" paradigms have shown promise for BCI devices.


Covert and Overt Attention Paradigm: Hunt and Kingstone [174] were among the first to use a covert attention BCI paradigm; they discovered the existence of a dissociation between voluntary shifts in overt and covert attention. In a covert attention paradigm, the subject is instructed to look at a centrally located fixation point and to follow another point (e.g., a cursor) without overt eye movement. In contrast, in an overt attention task the subject is instructed to use overt eye movements while attending to a moving object. Both of these approaches depend on visual attention, and the EEG signals are typically recorded from the posterior cortex. Additional studies using this paradigm were performed by Kelly et al. [175, 176]. In [176], they investigated parieto-occipital alpha-band (8-14 Hz) EEG activity in a covert attention paradigm to classify spatial attention to the right or left; later, they confirmed the existence of distinct patterns in overt and covert attention during preparatory processes [175]. Tonin and colleagues [177, 178] used a covert attention paradigm in a 2-class classification problem (i.e., attention to the right corner target of a monitor vs. attention to the left corner target) to control a BCI system in online mode and to provide feedback to the subject by showing the classification result. Additionally, Treder et al. [179] employed a covert attention paradigm for two-dimensional BCI control, covertly choosing a target among six targets equally distributed around a circle on a screen.
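The alpha-band lateralization exploited in [176] can be sketched with a simple hemispheric power comparison. This is an illustrative toy under assumed parameters (sampling rate, synthetic parieto-occipital channels, sign convention that alpha is suppressed contralateral to the attended side), not the classifier used in the cited work.

```python
import numpy as np

fs = 128
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(5)

def alpha_power(x):
    """Power in the 8-14 Hz band via the FFT (rectangular window)."""
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    spec = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= 8) & (freqs <= 14)
    return spec[band].sum()

def classify_attention(left_po, right_po):
    """Covert attention suppresses alpha over the contralateral hemisphere,
    so the attended side is opposite the hemisphere with weaker alpha."""
    return "right" if alpha_power(left_po) < alpha_power(right_po) else "left"

# Synthetic trial: attention to the right, so left-hemisphere alpha is weaker.
left_po = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=t.size)
right_po = 1.5 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=t.size)
print(classify_attention(left_po, right_po))
```

Real systems average this band power over many windows and feed the lateralization index to a trained classifier rather than using a fixed comparison.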
Discrete Movement Intention Paradigm: In the movement intention paradigm, EEG signals collected before movement onset are used to detect the intended movement of a BCI user and to manipulate the environment accordingly. In these studies, the subject may or may not be able to physically execute an actual movement; however, their EEG signals can confirm the intention of movement before the movement occurs [180]. Some studies use the terminology "attempted" [181] or "planned" [182] movement to describe this intention. The paradigm is particularly fruitful in motor rehabilitation: in robotic rehabilitation, a patient's intention can initiate the movement of a robot. Frisoli et al. [183] used a gaze-dependent variation of this paradigm for upper limb rehabilitation, in which EEG signals were used to adjust the jerk, acceleration, and speed of an exoskeleton. As a means of therapy for post-stroke patients, Muralidharan et al. [181] successfully extracted from EEG signals the intention to open or close a paretic left/right hand. A similar study by Lew et al. [184] with two able-bodied subjects and two post-stroke patients achieved an overall success rate of 80% in movement detection. Bai et al. [185] investigated EEG signals for the intention of right-hand and left-hand movements, and Bai et al. [180] predicted wrist extension movements in seven healthy subjects. Zhou et al. [186] classified EEG signals at the moment subjects (four healthy, two stroke) intended to perform shoulder abduction or elbow flexion movements, and EEG data were also analyzed for a chronic stroke patient before the onset of hand movement toward a target [187].
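In practice, pre-movement intention detection of this kind reduces to classifying short EEG epochs recorded before movement onset against resting epochs. The following is a deliberately minimal sketch of that idea, using synthetic data and epoch variance as a crude stand-in for sensorimotor band power; it is not the decoding pipeline of any of the studies cited above, and all amplitudes and parameters are invented for illustration.

```python
import random
import statistics

random.seed(0)

def band_power_proxy(epoch):
    # Variance of the raw epoch, a crude stand-in for sensorimotor band power
    # (a real pipeline would band-pass filter to the mu/beta bands first).
    return statistics.pvariance(epoch)

def make_epoch(amplitude, n=256):
    # Synthetic single-channel EEG epoch with a given oscillation amplitude.
    return [amplitude * random.gauss(0.0, 1.0) for _ in range(n)]

# Calibration data: pre-movement epochs show event-related desynchronization,
# i.e. lower sensorimotor power than rest (amplitudes here are invented).
rest_epochs = [make_epoch(2.0) for _ in range(20)]
intent_epochs = [make_epoch(1.0) for _ in range(20)]

rest_mean = statistics.mean(band_power_proxy(e) for e in rest_epochs)
intent_mean = statistics.mean(band_power_proxy(e) for e in intent_epochs)
threshold = (rest_mean + intent_mean) / 2.0

def detect_intention(epoch):
    # Flag movement intention when power drops below the midpoint threshold.
    return band_power_proxy(epoch) < threshold

print(detect_intention(make_epoch(1.0)))  # a new pre-movement-like epoch
```

A real system would use many channels, spatial filtering (e.g., CSP), and a trained classifier rather than a midpoint threshold, but the structure of calibration followed by online detection is the same.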
Auditory Paradigm: Auditory paradigms have also been investigated by a number of BCI researchers [83]. Depending on the paradigm, brain signals can be modulated either by an intention-driven (endogenous) BCI or by a stimulus-driven (exogenous) BCI; auditory P300 [188], for example, is considered an exogenous stimulation.
AUTHOR SUBMITTED MANUSCRIPT - JNE-102497.R2 Page 22 of 43

Auditory steady-state responses (ASSR) [189] are likewise exogenous: the ASSR is an auditory evoked potential elicited by rapid auditory stimuli, and Picton et al. [189] showed that its maximum amplitude is recorded from the vertex of the scalp. Sellers and Donchin [188] compared auditory and visual P300 paradigms in patients with ALS; although they demonstrated proof of principle with the auditory P300 BCI, performance was significantly better in the visual condition. Nijboer et al. [84] also validated the feasibility of an auditory-based BCI by comparison with a visual-based BCI. Ferracuti et al. [190] used a novel paradigm in which five classes of auditory stimuli were presented at five different locations in space.
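Detecting an ASSR amounts to measuring spectral power at the stimulus modulation frequency and comparing it against neighboring frequencies. A minimal illustration using the Goertzel algorithm on synthetic data (the 40 Hz modulation rate, sampling rate, and signal amplitudes are illustrative assumptions, not values from the cited studies):

```python
import math
import random

random.seed(1)

def goertzel_power(samples, fs, freq):
    # Power of `samples` at `freq` Hz via the Goertzel algorithm.
    n = len(samples)
    k = round(n * freq / fs)           # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

fs = 256                                # sampling rate (Hz), illustrative
mod = 40.0                              # assumed ASSR modulation frequency (Hz)
t = [i / fs for i in range(fs)]         # one second of data

# Synthetic "EEG": a weak 40 Hz ASSR component buried in broadband noise.
eeg = [0.5 * math.sin(2 * math.pi * mod * ti) + random.gauss(0, 1) for ti in t]

# Compare power at the modulation frequency against a neighbouring bin.
p_target = goertzel_power(eeg, fs, mod)
p_control = goertzel_power(eeg, fs, 33.0)
print(p_target > p_control)
```

Real ASSR detectors average over many stimulation epochs and use statistical tests across control frequencies, but the frequency-tagging principle is the one shown here.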
Somatosensory (Tactile) Paradigm: In recent years, the use of somatosensory paradigms for patients with visual impairment has become popular. In this paradigm, vibrotactile stimulators are placed at pre-determined parts of the body and driven at different frequencies [191]; the stimulation is reflected in EEG signals recorded from the scalp. Müller-Putz et al. [124] investigated the usability of the steady-state somatosensory evoked potential paradigm, while other researchers employed tactile P300 paradigms in their BCI systems [192]. Imagined tactile paradigms were also investigated by Yao et al. [85]. The somatosensory paradigm has been utilized to assist patients with locked-in syndrome [193, 194].
Reflexive Semantic Conditioning Paradigm: BCIs for communication purposes have been developed since the late eighties; however, it remains a great challenge to provide reliable results for people with severe motor disabilities, such as completely locked-in syndrome (CLIS). A paradigm named "reflexive semantic conditioning" (based on Pavlovian conditioning) was developed and tested in healthy participants as well as in people with a diagnosis of ALS. The main goal of the paradigm is to address communication problems in CLIS and ALS patients [195-199].
Table 5. Less Common EEG-based BCI paradigms.

Overt (covert) attention paradigm [174-179]: EEG signals are generated through overt (eye movement) or covert (eye fixation) attention on movements of a cursor on a screen.

Discrete movement intention paradigm [180-187, 200-204]: using recorded EEG signals, the intention of the subject is decoded prior to performing a task. It is a popular paradigm in rehabilitative robotics.

Auditory paradigm [83, 84, 188-190, 205-207]: the origin of the EEG signals is an external sound stimulus. A potential future application is aural prostheses.

Olfactory paradigm [208]: smelling or remembering an odor can cause distinguishable changes in EEG signals.

Real movement paradigm [209-211]: EEG signals are recorded (and used for control) while the subject performs a real movement.

Somatosensory (tactile) paradigm [85, 124, 191-194, 212, 213]: vibrotactile stimulators excite parts of the body (at different frequencies) while EEG signals are recorded for classification and for generating control commands.

Passive paradigm [154, 155]: passive EEG signals recorded without the purpose of voluntary control, reflecting the user's intentions, situational interpretations, and emotional states, are utilized as a complementary BCI.

Non-motor mental imagery paradigm [214]: EEG signals originate from non-motor imagery tasks such as mental arithmetic.

Slow cortical potentials paradigm [19, 215-218]: low-frequency EEG signals recorded from the prefrontal cortex are modulated through a long training time on a cognitive task while receiving neurofeedback.

Observation-based paradigm [219-221]: EEG signals are collected while the subject observes different actions performed by an external device (such as a prosthetic hand).

Eye-movement paradigm [222-224]: EEG signals are recorded while the subject is instructed to move the eyes in different directions; discrete classes are extracted from the EEG signals for controlling external objects.

Reflexive semantic conditioning paradigm [195-199]: EEG signals are modulated by presenting various statements; the paradigm is primarily used for communication in ALS and CLIS populations.
7 Current Issues and Future Considerations

In recent years, BCI research has made significant progress in neurorehabilitation and assistive device technology. Each of the methodologies presented in this review holds promise for brain-controlled external prosthetic devices for spinal cord injury patients and others with severe communication disorders such as ALS, LIS, and multiple sclerosis (MS). There is a strong possibility that BCI systems will be commercialized in the near future; in fact, a limited number of commercial devices are already available, and programs such as the BNCI Horizon 2020 project [225] have established a roadmap for future BCI systems. Nevertheless, there are critical limitations, challenges, and issues related to BCI paradigms and platforms that should be addressed by the BCI community. It is common practice in the BCI literature to report the results of a study in terms of classification accuracy; few publications address issues such as the reliability of the platforms, and it is often unclear what the behavioral, cognitive, sensory, and motor functional outcomes of a BCI study are. To further advance BCI research toward practical applications, we believe these important issues should be addressed in future work.
Training Time and Fatigue: One of the most significant challenges in BCI is the training required for a subject to become proficient with the system. Most paradigms have lengthy training times, which can cause fatigue in subjects. Although there are examples of long-term use of stimulus-based BCIs [112, 226], external stimulus paradigms such as P300-based systems may generally cause fatigue over extended periods of use. Moreover, subject dependency and even inter-session variability can make it necessary for BCI researchers to collect calibration data at the beginning of each session. To mitigate this problem, some recent studies have used methods such as transfer learning to develop a zero-training/generic BCI model that generalizes to most subjects [81, 227-230].
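The zero-training idea can be illustrated in miniature: pool labeled calibration data from several source subjects, fit a classifier once, and apply it to a new subject without any subject-specific calibration. The sketch below uses synthetic one-dimensional features and a nearest-class-mean rule; all distributions, shifts, and parameters are invented for illustration and do not come from the cited transfer-learning studies.

```python
import random
import statistics

random.seed(2)

def subject_features(shift, n=30):
    # Synthetic one-feature epochs for two classes; `shift` models
    # subject-specific variation in the feature distribution.
    a = [(random.gauss(-1.0 + shift, 0.5), 0) for _ in range(n)]
    b = [(random.gauss(+1.0 + shift, 0.5), 1) for _ in range(n)]
    return a + b

# Pool calibration data from several "source" subjects. The zero-training
# idea: the new user supplies no calibration data of their own.
pool = []
for shift in (-0.2, 0.0, 0.3):
    pool += subject_features(shift)

mean0 = statistics.mean(x for x, y in pool if y == 0)
mean1 = statistics.mean(x for x, y in pool if y == 1)

def classify(x):
    # Nearest-class-mean decision learned entirely from other subjects.
    return 0 if abs(x - mean0) < abs(x - mean1) else 1

# Evaluate on an unseen subject with its own feature shift.
test = subject_features(0.15)
accuracy = sum(classify(x) == y for x, y in test) / len(test)
print(accuracy > 0.8)
```

Practical transfer-learning methods additionally align or re-weight the source subjects' data toward the new user; the point of the sketch is only that a generic model can work when inter-subject shifts are small relative to class separation.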
Signal Processing and Novel Decoders: Many different decoding methods, signal processing algorithms [231], and classification algorithms [30] have recently been investigated. Nevertheless, the information extracted from EEG signals does not have a high enough signal-to-noise ratio to control a system such as a neuroprosthetic arm with multiple degrees of freedom, and more robust, accurate, and fast online algorithms are required for multi-DOF control. In recent years, some researchers have suggested that source localization of EEG [232] and active data selection [233] can improve classification performance. Others have suggested advanced machine learning and deep learning methods [234-237], which have the potential to extract additional features that improve classification, and adaptive classifiers and decoders have been proposed to compensate for the non-stationary nature of EEG signals [238]. Meanwhile, a standardization system is essential to evaluate the performance of decoding algorithms in specific applications and BCI systems [239].
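As a concrete illustration of the adaptive-decoder idea, the sketch below uses a nearest-class-mean classifier whose class means are updated by an exponential moving average after each labeled trial, so the decision boundary tracks a slow drift in the simulated feature distribution. This is a toy model under invented parameters, not one of the published adaptive methods cited above.

```python
import random

random.seed(3)

class AdaptiveMeanClassifier:
    """Nearest-class-mean classifier whose means track slow drift via an
    exponential moving average; a toy stand-in for adaptive BCI decoders."""

    def __init__(self, mean0, mean1, rate=0.1):
        self.means = [mean0, mean1]
        self.rate = rate  # adaptation rate (assumed value)

    def predict(self, x):
        return 0 if abs(x - self.means[0]) < abs(x - self.means[1]) else 1

    def update(self, x, label):
        # Nudge the matching class mean toward the new observation.
        m = self.means[label]
        self.means[label] = (1 - self.rate) * m + self.rate * x

clf = AdaptiveMeanClassifier(mean0=-1.0, mean1=1.0)

# Simulate a session where both class distributions drift upward over time,
# i.e. the non-stationarity an adaptive decoder is meant to absorb.
correct = 0
trials = 400
for t in range(trials):
    drift = 2.0 * t / trials          # slow upward drift of the features
    label = random.randint(0, 1)
    x = random.gauss((-1.0 if label == 0 else 1.0) + drift, 0.4)
    correct += clf.predict(x) == label
    clf.update(x, label)              # supervised adaptation after each trial
print(correct / trials)
```

A static classifier with fixed means would fail badly once the drift exceeds the class separation; the adaptive version keeps high accuracy because its boundary moves with the data.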
From Shared Control to Supervisory Control in Closed-Loop BCIs: A closed-loop BCI is considered a co-adaptive, mutual learning system in which the human and computer learn from each other, while adaptation also occurs in both the mathematical algorithms and the neural circuits. Millan [240] described the closed-loop BCI as a "two-learner system." The terms "shared control" and "hybrid control" have also been used to describe the contributions of both human and machine in performing the control task [20, 55, 143, 241-243]. A shared BCI system includes both high-level and low-level control: high-level control commands are generated by the brain, while traditional control systems are responsible for low-level control tasks. Interestingly, in high-level control there is always a tradeoff between naturalness of control and subject fatigue. The ideal BCI system with mutual interaction can be described as a supervisory control system in which the subject leads with minimum involvement (high-level control) and the BCI serves as an intelligent system (low-level control) [140, 244]. Through cognitive monitoring, the user can act as a supervisor of the external autonomous system instead of continuously issuing control commands.
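The shared-control notion of blending a high-level brain command with a low-level autonomous controller can be sketched as a confidence-weighted mixture. All names, values, and the blending rule below are illustrative assumptions, not a published controller:

```python
def shared_control(user_cmd, autonomy_cmd, confidence):
    # Blend a decoded high-level user command with a low-level autonomous
    # controller's command; `confidence` (0..1) is the decoder's confidence
    # in the user command.
    return tuple(confidence * u + (1.0 - confidence) * a
                 for u, a in zip(user_cmd, autonomy_cmd))

# The user wants to steer right; the obstacle-avoidance controller votes left.
user = (1.0, 0.0)       # decoded brain command (e.g., a 2D velocity)
autonomy = (-0.5, 0.2)  # low-level controller command

# With a confident decode the user dominates; with a poor decode the
# autonomous controller takes over, which is the essence of shared control.
print(shared_control(user, autonomy, confidence=0.9))
print(shared_control(user, autonomy, confidence=0.2))
```

In a supervisory-control arrangement, the same idea is pushed further: the user only occasionally overrides an otherwise autonomous low-level loop, which corresponds to holding `confidence` near zero except at supervision events.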
The definition of a closed-loop control system is currently a controversial issue [141, 245]. In reality, in an EEG-based BCI, some type of artificial sensory feedback beyond visual feedback [246] should be considered to provide the subject with the strongest feeling of control in closed-loop form. In contrast, invasively controlled prosthetic arms can include the sense of touch, which strengthens the perception of a closed-loop control system [247]. In EEG-based BCI platforms, various feedback mechanisms have been investigated, including brain stimulation [35], reaction force [248], and somatosensory stimulation [42].
Development of New EEG Technologies: Since scalp EEG is a low-cost, affordable brain monitoring technology, it has the potential to be commercialized for the general public [3]. There are studies that determine alertness/drowsiness from brain dynamics while evaluating behavioral changes, with applications to drowsy driving. A portable EEG headset helps in understanding the brain dynamics underlying the integration of perceptual functions in different scenarios; some studies evaluate behavioral changes in response to auditory signals in a driving environment and find correlations between brainwaves and other sensory inputs such as haptic feedback. As part of this development, many researchers have investigated wearable and wireless EEG headsets [173, 249]. Dry EEG sensors have also been developed [250-253]; these sensors do not require the skin preparation or gel application demanded by conventional wet sensors. The development of these new EEG headsets could extend the application of BCIs beyond current levels. For example, a forehead EEG-based BCI could be used as a sleep management system to assess sleep quality, or as a depression treatment screening system to evaluate and predict the efficacy of rapid antidepressant agents. Nevertheless, there are still limitations to dry electrode technology: the sensors are uncomfortable against the scalp, they are very sensitive to muscle and movement artifacts, and the recording quality of current dry headsets typically degrades after approximately one hour.
Neurofeedback and Future Paradigms: One future direction of BCI is its application in neurofeedback [254]. Neurofeedback, a type of biofeedback, is the process of self-regulating brainwaves to improve various aspects of cognitive control. In some cases, neurofeedback-based BCIs could potentially replace medications, thereby reducing their negative side effects. For example, this technology could help to alleviate cognitive and pathological neural conditions such as migraine headaches: a headache detection and management system could notify migraine patients of imminent headaches days in advance while offering treatment in neurofeedback form. Neurofeedback-based BCIs could also be developed to assist in the treatment of people with addiction, obesity, autism, and asthma [255]. New EEG paradigms can likewise be developed to facilitate cognitive control [256] and interaction with the environment [154, 155]. For instance, the ErrP can serve as a useful mechanism to enhance neurofeedback, since it allows a user to observe and spontaneously make the desired change in a BCI system without directly performing a control task. Moreover, new cognitive models of neurofeedback can be developed for neurorehabilitation of cognitive deficits, such as ADHD, anxiety, epilepsy, Alzheimer's disease, traumatic brain injury, and post-traumatic stress disorder [257-263].
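At its core, a neurofeedback protocol rewards the user whenever a chosen brain feature crosses a threshold, and adapts that threshold so that training stays challenging but achievable. Below is a toy sketch of such an adaptive (staircase) reward rule applied to simulated band-power values; the target rate, step size, and signal statistics are illustrative assumptions rather than a published protocol.

```python
import random

random.seed(4)

def neurofeedback_session(powers, target_rate=0.7, step=0.05):
    """Toy neurofeedback shaping loop: reward epochs whose band power
    exceeds an adaptive threshold, and move the threshold so the reward
    rate settles near `target_rate` (all parameters are illustrative)."""
    threshold = powers[0]
    rewards = []
    for p in powers:
        rewarded = p > threshold
        rewards.append(rewarded)
        # Raise the bar after success, lower it after failure, scaled so
        # the long-run reward rate equilibrates near the target.
        if rewarded:
            threshold += step * (1.0 - target_rate)
        else:
            threshold -= step * target_rate
    return rewards, threshold

# Simulated per-epoch band-power readings from one training session.
powers = [random.gauss(10.0, 1.0) for _ in range(500)]
rewards, final_threshold = neurofeedback_session(powers)
print(sum(rewards) / len(rewards))  # reward rate held near the target by the staircase
```

The update rule balances at the point where the expected upward and downward threshold steps cancel, which happens exactly when the reward probability equals the target rate; this is the same logic used by adaptive staircases in psychophysics.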
8 Conclusions

Currently, there is a high level of interest in non-invasive BCI technology, and many factors have contributed to the popularity of these systems. Because of wireless recording, low-cost amplifiers, high temporal resolution, and advanced signal analysis methodology, the systems are increasingly accessible to researchers in many scientific domains. As described in this review, a critical aspect of employing a BCI system is matching the appropriate control signal with the desired application: it is essential to choose the most reliable, accurate, and convenient paradigm to manipulate a neuroprosthetic device or implement a specific neurorehabilitation program. The current review has evaluated several EEG-based BCI paradigms in terms of their advantages and disadvantages from a variety of perspectives. Each paradigm was described in terms of its control signals, EEG decoding algorithms, and classification methods, and the target populations of each paradigm were summarized. Finally, potential problems with EEG-based BCI systems were discussed, and possible solutions were proposed.
Acknowledgments
The authors are grateful to Dr. Jose Millan for his insightful comments on an early draft of this manuscript. The assistance of Megan Pitz with the manuscript is also appreciated. This work was partially supported by NeuroNET at UTK.
Author Contributions
R.A. and S.B. contributed equally and share first authorship. E.S. and Y.J. revised the paper and contributed insightful comments. X.Z. was involved in all aspects of the study.
Competing Financial Interests
The authors declare no competing financial interests.
38
pte

39
40
References
41 [1] L. M. Nirenberg, J. Hanley, and E. B. Stear, "A New Approach to Prosthetic Control:
42
43
EEG Motor Signal Tracking With an Adaptively Designed Phase-Locked Loop,"
44 Biomedical Engineering, IEEE Transactions on, vol. BME-18, no. 6, pp. 389-398, 1971.
45 [2] J. R. Wolpaw, N. Birbaumer, D. J. McFarland, G. Pfurtscheller, and T. M. Vaughan,
46 "Brain-computer interfaces for communication and control," (in eng), Clin Neurophysiol,
ce

47 vol. 113, no. 6, pp. 767-91, Jun 2002.


48 [3] L. F. Nicolas-Alonso and J. Gomez-Gil, "Brain computer interfaces, a review," Sensors,
49
vol. 12, no. 2, pp. 1211-1279, 2012.
50
51 [4] M. A. Lebedev and M. A. Nicolelis, "Brain-Machine Interfaces: From Basic Science to
52 Neuroprostheses and Neurorehabilitation," Physiological Reviews, vol. 97, no. 2, pp. 767-
Ac

53 837, 2017.
54 [5] R. A. Ramadan and A. V. Vasilakos, "Brain computer interface: control signals review,"
55 Neurocomputing, vol. 223, pp. 26-44, 2017.
56
57
58 26
59
60
Page 27 of 43 AUTHOR SUBMITTED MANUSCRIPT - JNE-102497.R2

1
2
3 [6] S. Waldert, T. Pistohl, C. Braun, T. Ball, A. Aertsen, and C. Mehring, "A review on
4

pt
5
directional information in neural signals for brain-machine interfaces," Journal of
6 Physiology-Paris, vol. 103, no. 3, pp. 244-254, 2009.
7 [7] M. W. Slutzky and R. D. Flint, "Physiological properties of brain-machine interface input
8 signals," Journal of neurophysiology, vol. 118, no. 2, pp. 1329-1343, 2017.
9 [8] D. W. Moran and A. B. Schwartz, "Motor cortical representation of speed and direction

cri
10 during reaching," Journal of Neurophysiology, vol. 82, no. 5, pp. 2676-2692, 1999.
11
[9] L. R. Hochberg et al., "Neuronal ensemble control of prosthetic devices by a human with
12
13
tetraplegia," Nature, vol. 442, no. 7099, pp. 164-171, 2006.
14 [10] S.-P. Kim, J. D. Simeral, L. R. Hochberg, J. P. Donoghue, and M. J. Black, "Neural
15 control of computer cursor velocity by decoding motor cortical spiking activity in
16 humans with tetraplegia," Journal of neural engineering, vol. 5, no. 4, p. 455, 2008.

us
17 [11] G. H. Mulliken, S. Musallam, and R. A. Andersen, "Decoding trajectories from posterior
18 parietal cortex ensembles," the Journal of Neuroscience, vol. 28, no. 48, pp. 12913-
19
12926, 2008.
20
21 [12] M. Hauschild, G. H. Mulliken, I. Fineman, G. E. Loeb, and R. A. Andersen, "Cognitive
22 signals for brain–machine interfaces in posterior parietal cortex include continuous 3D
23
24
25
26
27
28
[13]

[14]
42, pp. 17075-17080, 2012.
an
trajectory commands," Proceedings of the National Academy of Sciences, vol. 109, no.

L. R. Hochberg et al., "Reach and grasp by people with tetraplegia using a neurally
controlled robotic arm," Nature, vol. 485, no. 7398, pp. 372-375, 2012.
S.-P. Kim, J. D. Simeral, L. R. Hochberg, J. P. Donoghue, G. M. Friehs, and M. J. Black,
dM
29 "Point-and-click cursor control with an intracortical neural interface system by humans
30 with tetraplegia," Neural Systems and Rehabilitation Engineering, IEEE Transactions on,
31 vol. 19, no. 2, pp. 193-203, 2011.
32 [15] J. Vogel et al., "An assistive decision-and-control architecture for force-sensitive hand–
33 arm systems driven by human–machine interfaces," The International Journal of
34
Robotics Research, vol. 34, no. 6, pp. 763-780, 2015.
35
36
[16] D. M. Taylor, S. I. H. Tillery, and A. B. Schwartz, "Direct cortical control of 3D
37 neuroprosthetic devices," Science, vol. 296, no. 5574, pp. 1829-1832, 2002.
38 [17] M. Velliste, S. Perel, M. C. Spalding, A. S. Whitford, and A. B. Schwartz, "Cortical
pte

39 control of a prosthetic arm for self-feeding," Nature, vol. 453, no. 7198, pp. 1098-1101,
40 2008.
41 [18] L. Bi, X.-A. Fan, and Y. Liu, "EEG-based brain-controlled mobile robots: a survey,"
42
Human-Machine Systems, IEEE Transactions on, vol. 43, no. 2, pp. 161-176, 2013.
43
44 [19] N. Birbaumer et al., "A spelling device for the paralysed," Nature, vol. 398, no. 6725, pp.
45 297-298, 1999.
46 [20] J. Meng, S. Zhang, A. Bekyo, J. Olsoe, B. Baxter, and B. He, "Noninvasive
ce

47 Electroencephalogram Based Control of a Robotic Arm for Reach and Grasp Tasks,"
48 Scientific Reports, vol. 6, p. 38565, 2016.
49 [21] J. J. Daly and J. R. Wolpaw, "Brain-computer interfaces in neurological rehabilitation,"
50
51
(in eng), Lancet Neurol, vol. 7, no. 11, pp. 1032-43, Nov 2008.
52 [22] N. Birbaumer and L. G. Cohen, "Brain–computer interfaces: communication and
Ac

53 restoration of movement in paralysis," The Journal of Physiology, vol. 579, no. 3, pp.
54 621-636, March 15, 2007 2007.
55
56
57
58 27
59
60
AUTHOR SUBMITTED MANUSCRIPT - JNE-102497.R2 Page 28 of 43

1
2
3 [23] S. Machado, L. F. Almada, and R. N. Annavarapu, "Progress and Prospects in EEG-
4

pt
5
Based Brain-Computer Interface: Clinical Applications in Neurorehabilitation," Journal
6 of Rehabilitation Robotics, vol. 1, no. 1, pp. 28-41, 2013.
7 [24] S. Moghimi, A. Kushki, A. Marie Guerguerian, and T. Chau, "A Review of EEG-Based
8 Brain-Computer Interfaces as Access Pathways for Individuals with Severe Disabilities,"
9 Assistive Technology, vol. 25, no. 2, pp. 99-110, 2013/04/03 2012.

cri
10 [25] N. Birbaumer, "Breaking the silence: brain-computer interfaces (BCI) for communication
11
and motor control," (in eng), Psychophysiology, vol. 43, no. 6, pp. 517-32, Nov 2006.
12
13
[26] J. L. Contreras-Vidal, A. Presacco, H. Agashe, and A. Paek, "Restoration of Whole Body
14 Movement: Toward a Noninvasive Brain-Machine Interface System," Pulse, IEEE, vol.
15 3, no. 1, pp. 34-37, 2012.
16 [27] T. Mulder, "Motor imagery and action observation: cognitive tools for rehabilitation,"

us
17 Journal of neural transmission, vol. 114, no. 10, pp. 1265-1278, 2007.
18 [28] T. M. Vaughan, J. R. Wolpaw, and E. Donchin, "EEG-based communication: prospects
19
and problems," IEEE transactions on rehabilitation engineering, vol. 4, no. 4, pp. 425-
20
21 430, 1996.
22 [29] H.-J. Hwang, S. Kim, S. Choi, and C.-H. Im, "EEG-Based Brain-Computer Interfaces: A
23
24
25
26
27
28
[30] an
Thorough Literature Survey," International Journal of Human-Computer Interaction,
vol. 29, no. 12, pp. 814-826, 2013/12/02 2013.
F. Lotte, M. Congedo, A. Lécuyer, F. Lamarche, and B. Arnaldi, "A review of
classification algorithms for EEG-based brain–computer interfaces," Journal of neural
engineering, vol. 4, no. 2, p. R1, 2007.
dM
29 [31] G. Pfurtscheller, B. Graimann, and C. Neuper, "EEG‐ Based Brain‐ Computer Interface
30 System," Wiley Encyclopedia of Biomedical Engineering, 2006.
31 [32] S. Machado et al., "EEG-based brain-computer interfaces: an overview of basic concepts
32 and clinical applications in neurorehabilitation," Reviews in the neurosciences, vol. 21,
33 no. 6, pp. 451-468, 2010.
34
[33] G. Pfurtscheller and C. Neuper, "Motor imagery activates primary sensorimotor area in
35
36
humans," Neuroscience letters, vol. 239, no. 2, pp. 65-68, 1997.
37 [34] H. Yuan and B. He, "Brain-Computer Interfaces Using Sensorimotor Rhythms: Current
38 State and Future Perspectives," 2014.
pte

39 [35] B. He, B. Baxter, B. J. Edelman, C. C. Cline, and W. Y. Wenjing, "Noninvasive brain-


40 computer interfaces based on sensorimotor rhythms," Proceedings of the IEEE, vol. 103,
41 no. 6, pp. 907-925, 2015.
42
[36] V. Morash, O. Bai, S. Furlani, P. Lin, and M. Hallett, "Classifying EEG signals preceding
43
44 right hand, left hand, tongue, and right foot movements and motor imageries," Clinical
45 neurophysiology, vol. 119, no. 11, pp. 2570-2578, 2008.
46 [37] G. Pfurtscheller and F. L. Da Silva, "Event-related EEG/MEG synchronization and
ce

47 desynchronization: basic principles," Clinical neurophysiology, vol. 110, no. 11, pp.
48 1842-1857, 1999.
49 [38] J. R. Wolpaw, D. J. McFarland, G. W. Neat, and C. A. Forneris, "An EEG-based brain-
50
51
computer interface for cursor control," Electroencephalography and Clinical
52 Neurophysiology, vol. 78, no. 3, pp. 252-259, 3// 1991.
Ac

53 [39] K. LaFleur, K. Cassady, A. Doud, K. Shades, E. Rogin, and B. He, "Quadcopter control
54 in three-dimensional space using a noninvasive motor imagery-based brain–computer
55 interface," Journal of neural engineering, vol. 10, no. 4, p. 046003, 2013.
56
57
58 28
59
60
Page 29 of 43 AUTHOR SUBMITTED MANUSCRIPT - JNE-102497.R2

1
2
3 [40] J. R. Wolpaw and D. J. McFarland, "Control of a two-dimensional movement signal by a
4

pt
5
noninvasive brain-computer interface in humans," Proceedings of the National Academy
6 of Sciences of the United States of America, vol. 101, no. 51, pp. 17849-17854, 2004.
7 [41] S. Bhattacharyya, A. Khasnobish, A. Konar, D. N. Tibarewala, and A. K. Nagar,
8 "Performance analysis of left/right hand movement classification from EEG signal by
9 intelligent algorithms," in Computational Intelligence, Cognitive Algorithms, Mind, and

cri
10 Brain (CCMB), 2011 IEEE Symposium on, 2011, pp. 1-8.
11
[42] A. R. Murguialday et al., "Brain-Computer Interface for a Prosthetic Hand Using Local
12
13
Machine Control and Haptic Feedback," in Rehabilitation Robotics, 2007. ICORR 2007.
14 IEEE 10th International Conference on, 2007, pp. 609-613.
15 [43] C.-W. Chen, C.-C. K. Lin, and M.-S. Ju, "Hand orthosis controlled using brain-computer
16 interface," Journal of Medical and Biological Engineering, vol. 29, no. 5, pp. 234-241,

us
17 2009.
18 [44] J. R. Wolpaw and D. J. McFarland, "Multichannel EEG-based brain-computer
19
communication," Electroencephalography and Clinical Neurophysiology, vol. 90, no. 6,
20
21 pp. 444-449, 6// 1994.
22 [45] G. R. Muller-Putz, R. Scherer, G. Pfurtscheller, and R. Rupp, "EEG-based
23
24
25
26
27
28
[46] an
neuroprosthesis control: a step towards clinical practice," (in eng), Neurosci Lett, vol.
382, no. 1-2, pp. 169-74, Jul 1-8 2005.
K. K. Ang et al., "A clinical study of motor imagery-based brain-computer interface for
upper limb robotic rehabilitation," in Engineering in Medicine and Biology Society, 2009.
EMBC 2009. Annual International Conference of the IEEE, 2009, pp. 5981-5984: IEEE.
dM
29 [47] B. S. Baxter, A. Decker, and B. He, "Noninvasive control of a robotic arm in multiple
30 dimensions using scalp electroencephalogram," in Neural Engineering (NER), 2013 6th
31 International IEEE/EMBS Conference on, 2013, pp. 45-47: IEEE.
32 [48] M. Sarac, E. Koyas, A. Erdogan, M. Cetin, and V. Patoglu, "Brain Computer Interface
33 based robotic rehabilitation with online modification of task speed," (in eng), IEEE Int
34
Conf Rehabil Robot, vol. 2013, p. 6650423, Jun 2013.
35
36
[49] D. J. McFarland, W. A. Sarnacki, and J. R. Wolpaw, "Electroencephalographic (EEG)
37 control of three-dimensional movement," (in eng), J Neural Eng, vol. 7, no. 3, p. 036007,
38 Jun 2010.
pte

39 [50] C. Guger, W. Harkam, C. Hertnaes, and G. Pfurtscheller, "Prosthetic control by an EEG-


40 based brain-computer interface (BCI)," in Proc. aaate 5th european conference for the
41 advancement of assistive technology, 1999, pp. 3-6.
42
[51] G. Pfurtscheller, G. R. Müller, J. Pfurtscheller, H. J. Gerner, and R. Rupp, "‘Thought’ –
43
44 control of functional electrical stimulation to restore hand grasp in a patient with
45 tetraplegia," Neuroscience Letters, vol. 351, no. 1, pp. 33-36, 11/6/ 2003.
46 [52] S. Aiqin, F. Binghui, and J. Chaochuan, "Motor imagery EEG-based online control
ce

47 system for upper artificial limb," in Transportation, Mechanical, and Electrical


48 Engineering (TMEE), 2011 International Conference on, 2011, pp. 1646-1649.
49 [53] R. Roy, A. Konar, D. N. Tibarewala, and R. Janarthanan, "EEG driven model predictive
50
51
position control of an artificial limb using neural net," in Computing Communication &
52 Networking Technologies (ICCCNT), 2012 Third International Conference on, 2012, pp.
Ac

53 1-9.
54
55
56
57
58 29
59
60
AUTHOR SUBMITTED MANUSCRIPT - JNE-102497.R2 Page 30 of 43

1
2
3 [54] A. J. Doud, J. P. Lucas, M. T. Pisansky, and B. He, "Continuous three-dimensional
4

pt
5
control of a virtual helicopter using a motor imagery based brain-computer interface,"
6 PloS one, vol. 6, no. 10, p. e26322, 2011.
7 [55] T. Li, J. Hong, J. Zhang, and F. Guo, "Brain–machine interface control of a manipulator
8 using small-world neural network and shared control strategy," Journal of neuroscience
9 methods, vol. 224, pp. 26-38, 2014.

cri
10 [56] S. Fok et al., "An EEG-based brain computer interface for rehabilitation and restoration
11
of hand control following stroke using ipsilateral cortical physiology," (in eng), Conf
12
13
Proc IEEE Eng Med Biol Soc, vol. 2011, pp. 6277-80, 2011.
14 [57] C. Wang et al., "A feasibility study of non-invasive motor-imagery BCI-based robotic
15 rehabilitation for Stroke patients," in Neural Engineering, 2009. NER'09. 4th
16 International IEEE/EMBS Conference on, 2009, pp. 271-274: IEEE.

us
17 [58] E. Buch et al., "Think to move: a neuromagnetic brain-computer interface (BCI) system
18 for chronic stroke," stroke, vol. 39, no. 3, pp. 910-917, 2008.
19
[59] A. Ramos-Murguialday et al., "Proprioceptive feedback and brain computer interface
20
21 (BCI) based neuroprostheses," PloS one, vol. 7, no. 10, p. e47048, 2012.
22 [60] A. Ramos‐ Murguialday et al., "Brain–machine interface in chronic stroke rehabilitation:
23
24
25
26
27
28
[61]

[62]
an
a controlled study," Annals of neurology, vol. 74, no. 1, pp. 100-108, 2013.
V. Kaiser, I. Daly, F. Pichiorri, D. Mattia, G. R. Müller-Putz, and C. Neuper,
"Relationship between electrical brain responses to motor imagery and motor impairment
in stroke," Stroke, vol. 43, no. 10, pp. 2735-2740, 2012.
J. Pereira, P. Ofner, A. Schwarz, A. I. Sburlea, and G. R. Müller-Putz, "EEG neural
dM
29 correlates of goal-directed movement intention," NeuroImage, vol. 149, pp. 129-140,
30 2017.
31 [63] T. J. Bradberry, R. J. Gentili, and J. L. Contreras-Vidal, "Fast attainment of computer
32 cursor control with noninvasively acquired brain signals," Journal of neural engineering,
33 vol. 8, no. 3, p. 036010, 2011.
34
[64] P. Ofner and G. R. Müller-Putz, "EEG-Based Classification of Imagined Arm
35
36
Trajectories," in Replace, Repair, Restore, Relieve–Bridging Clinical and Engineering
37 Solutions in Neurorehabilitation: Springer, 2014, pp. 611-620.
38 [65] J.-H. Kim, F. Biessmann, and S.-W. Lee, "Decoding Three-Dimensional Trajectory of
pte

39 Executed and Imagined Arm Movements from Electroencephalogram Signals," 2014.


40 [66] A. Úbeda, J. M. Azorín, R. Chavarriaga, and J. d. R. Millán, "Evaluating decoding
41 performance of upper limb imagined trajectories during center-out reaching tasks," in
42
Systems, Man, and Cybernetics (SMC), 2016 IEEE International Conference on, 2016,
43
44 pp. 000252-000257: IEEE.
45 [67] Y. Gu, K. Dremstrup, and D. Farina, "Single-trial discrimination of type and speed of
46 wrist movements from EEG recordings," Clinical Neurophysiology, vol. 120, no. 8, pp.
ce

47 1596-1600, 8// 2009.


48 [68] Y. Gu, D. Farina, A. R. Murguialday, K. Dremstrup, P. Montoya, and N. Birbaumer,
49 "Offline identification of imagined speed of wrist movements in paralyzed ALS patients
50
51
from single-trial EEG," (in English), Frontiers in Neuroscience, Original Research vol. 3,
52 2009-August-10 2009.
Ac

53 [69] A. Vuckovic and F. Sepulveda, "Delta band contribution in cue based single trial
54 classification of real and imaginary wrist movements," (in eng), Med Biol Eng Comput,
55 vol. 46, no. 6, pp. 529-39, Jun 2008.
56
57
58 30
59
60
Page 31 of 43 AUTHOR SUBMITTED MANUSCRIPT - JNE-102497.R2

[70] T. Chakraborti et al., "Implementation of EEG based control of remote robotic systems," in Recent Trends in Information Systems (ReTIS), 2011 International Conference on, 2011, pp. 203-208: IEEE.
[71] A. K. Mohamed, T. Marwala, and L. R. John, "Single-trial EEG discrimination between wrist and finger movement imagery and execution in a sensorimotor BCI," Conf Proc IEEE Eng Med Biol Soc, vol. 2011, pp. 6289-6293, 2011.
[72] T. J. Bradberry, R. J. Gentili, and J. L. Contreras-Vidal, "Reconstructing Three-Dimensional Hand Movements from Noninvasive Electroencephalographic Signals," The Journal of Neuroscience, vol. 30, no. 9, pp. 3432-3437, 2010.
[73] P. Ofner and G. R. Müller-Putz, "Using a noninvasive decoding method to classify rhythmic movement imaginations of the arm in two planes," IEEE Transactions on Biomedical Engineering, vol. 62, no. 3, pp. 972-981, 2015.
[74] H. Yuan, C. Perdoni, and B. He, "Relationship between speed and EEG activity during imagined and executed hand movements," Journal of Neural Engineering, vol. 7, no. 2, p. 26001, 2010.
[75] A. Korik, R. Sosnik, N. Siddique, and D. Coyle, "Imagined 3D hand movement trajectory decoding from sensorimotor EEG rhythms," in Systems, Man, and Cybernetics (SMC), 2016 IEEE International Conference on, 2016, pp. 004591-004596: IEEE.
[76] R. Abiri, G. Heise, F. Schwartz, and X. Zhao, "EEG-based control of a unidimensional computer cursor using imagined body kinematics," in Biomedical Engineering Society Annual Meeting (BMES 2015), 2015.
[77] R. Abiri, X. Zhao, G. Heise, Y. Jiang, and F. Abiri, "Brain computer interface for gesture control of a social robot: An offline study," in Electrical Engineering (ICEE), 2017 Iranian Conference on, 2017, pp. 113-117: IEEE.
[78] R. Abiri, S. Borhani, X. Zhao, and Y. Jiang, "Real-time brain machine interaction via social robot gesture control," in ASME 2017 Dynamic Systems and Control Conference, 2017, p. V001T37A002: American Society of Mechanical Engineers.
[79] R. Abiri, J. Kilmarx, S. Borhani, X. Zhao, and Y. Jiang, "A Brain-Machine Interface for a Sequence Movement Control of a Robotic Arm," in Society for Neuroscience (SfN 2017), 2017.
[80] R. Abiri, J. Kilmarx, M. Raji, and X. Zhao, "Planar Control of a Quadcopter Using a Zero-Training Brain Machine Interface Platform," in Biomedical Engineering Society Annual Meeting (BMES 2016), 2016.
[81] S. Borhani, R. Abiri, X. Zhao, and Y. Jiang, "A Transfer Learning Approach towards Zero-training BCI for EEG-Based Two Dimensional Cursor Control," in Society for Neuroscience (SfN 2017), 2017.
[82] D. Kapgate and D. Kalbande, "A Review on Visual Brain Computer Interface," in Advancements of Medical Electronics: Springer, 2015, pp. 193-206.
[83] S. Gao, Y. Wang, X. Gao, and B. Hong, "Visual and auditory brain–computer interfaces," IEEE Transactions on Biomedical Engineering, vol. 61, no. 5, pp. 1436-1447, 2014.
[84] F. Nijboer et al., "An auditory brain–computer interface (BCI)," Journal of Neuroscience Methods, vol. 167, no. 1, pp. 43-50, 2008.
[85] L. Yao, X. Sheng, D. Zhang, N. Jiang, D. Farina, and X. Zhu, "A BCI System Based on Somatosensory Attentional Orientation," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 25, no. 1, pp. 81-90, 2017.
[86] M. Fabiani, G. Gratton, D. Karis, and E. Donchin, "Definition, identification, and reliability of measurement of the P300 component of the event-related brain potential," Advances in Psychophysiology, vol. 2, no. S 1, p. 78, 1987.
[87] J. Polich, "Updating P300: an integrative theory of P3a and P3b," Clinical Neurophysiology, vol. 118, no. 10, pp. 2128-2148, 2007.
[88] L. A. Farwell and E. Donchin, "Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials," Electroencephalography and Clinical Neurophysiology, vol. 70, no. 6, pp. 510-523, 1988.
[89] S. Halder et al., "Prediction of P300 BCI aptitude in severe motor impairment," PLoS ONE, vol. 8, no. 10, p. e76148, 2013.
[90] R. Fazel-Rezai, B. Z. Allison, C. Guger, E. W. Sellers, S. C. Kleih, and A. Kübler, "P300 brain computer interface: current challenges and emerging trends," Frontiers in Neuroengineering, vol. 5, 2012.
[91] L. M. McCane et al., "Brain-computer interface (BCI) evaluation in people with amyotrophic lateral sclerosis," Amyotrophic Lateral Sclerosis and Frontotemporal Degeneration, vol. 15, no. 3-4, pp. 207-215, 2014.
[92] P. Cipresso et al., "The use of P300-based BCIs in amyotrophic lateral sclerosis: from augmentative and alternative communication to cognitive assessment," Brain and Behavior, vol. 2, no. 4, pp. 479-498, 2012.
[93] S. Sutton, P. Tueting, J. Zubin, and E. R. John, "Information delivery and the sensory evoked potential," Science, vol. 155, no. 3768, pp. 1436-1439, 1967.
[94] E. Donchin, K. M. Spencer, and R. Wijesinghe, "The mental prosthesis: assessing the speed of a P300-based brain-computer interface," IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 2, pp. 174-179, 2000.
[95] F. Piccione et al., "P300-based brain computer interface: reliability and performance in healthy and paralysed participants," Clinical Neurophysiology, vol. 117, no. 3, pp. 531-537, 2006.
[96] D. J. Krusienski et al., "A comparison of classification techniques for the P300 Speller," Journal of Neural Engineering, vol. 3, no. 4, p. 299, 2006.
[97] L. Citi, R. Poli, C. Cinel, and F. Sepulveda, "P300-based BCI mouse with genetically-optimized analogue control," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 16, no. 1, pp. 51-61, 2008.
[98] S. Silvoni et al., "P300-based brain-computer interface communication: evaluation and follow-up in amyotrophic lateral sclerosis," Frontiers in Neuroscience, vol. 3, p. 1, 2009.
[99] M. Marchetti, F. Piccione, S. Silvoni, and K. Priftis, "Exogenous and endogenous orienting of visuospatial attention in P300-guided brain computer interfaces: A pilot study on healthy participants," Clinical Neurophysiology, vol. 123, no. 4, pp. 774-779, 2012.
[100] M. Marchetti, F. Piccione, S. Silvoni, L. Gamberini, and K. Priftis, "Covert visuospatial attention orienting in a brain-computer interface for amyotrophic lateral sclerosis patients," Neurorehabilitation and Neural Repair, vol. 27, no. 5, pp. 430-438, 2013.
[101] S. Silvoni, M. Cavinato, C. Volpato, C. A. Ruf, N. Birbaumer, and F. Piccione, "Amyotrophic lateral sclerosis progression and stability of brain-computer interface communication," Amyotrophic Lateral Sclerosis and Frontotemporal Degeneration, vol. 14, no. 5-6, pp. 390-396, 2013.
[102] D. J. Krusienski, E. W. Sellers, D. J. McFarland, T. M. Vaughan, and J. R. Wolpaw, "Toward enhanced P300 speller performance," Journal of Neuroscience Methods, vol. 167, no. 1, pp. 15-21, 2008.
[103] C. J. Bell, P. Shenoy, R. Chalodhorn, and R. P. Rao, "Control of a humanoid robot by a noninvasive brain–computer interface in humans," Journal of Neural Engineering, vol. 5, no. 2, p. 214, 2008.
[104] G. Edlinger, C. Holzner, C. Groenegress, C. Guger, and M. Slater, "Goal-oriented control with brain-computer interface," in International Conference on Foundations of Augmented Cognition, 2009, pp. 732-740: Springer.
[105] W.-d. Chen et al., "A P300 based online brain-computer interface system for virtual hand control," Journal of Zhejiang University SCIENCE C, vol. 11, no. 8, pp. 587-597, 2010.
[106] R. Fazel-Rezai and K. Abhari, "A region-based P300 speller for brain-computer interface," Canadian Journal of Electrical and Computer Engineering, vol. 34, no. 3, pp. 81-85, 2009.
[107] G. Townsend et al., "A novel P300-based brain–computer interface stimulus presentation paradigm: moving beyond rows and columns," Clinical Neurophysiology, vol. 121, no. 7, pp. 1109-1120, 2010.
[108] M. Moghadamfalahi, U. Orhan, M. Akcakaya, H. Nezamfar, M. Fried-Oken, and D. Erdogmus, "Language-model assisted brain computer interface for typing: a comparison of matrix and rapid serial visual presentation," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 23, no. 5, pp. 910-920, 2015.
[109] U. Hoffmann, J.-M. Vesin, T. Ebrahimi, and K. Diserens, "An efficient P300-based brain–computer interface for disabled subjects," Journal of Neuroscience Methods, vol. 167, no. 1, pp. 115-125, 2008.
[110] I. Iturrate, J. M. Antelis, A. Kubler, and J. Minguez, "A noninvasive brain-actuated wheelchair based on a P300 neurophysiological protocol and automated navigation," IEEE Transactions on Robotics, vol. 25, no. 3, pp. 614-627, 2009.
[111] F. Nijboer et al., "A P300-based brain–computer interface for people with amyotrophic lateral sclerosis," Clinical Neurophysiology, vol. 119, no. 8, pp. 1909-1916, 2008.
[112] E. W. Sellers, T. M. Vaughan, and J. R. Wolpaw, "A brain-computer interface for long-term independent home use," Amyotrophic Lateral Sclerosis, vol. 11, no. 5, pp. 449-455, 2010.
[113] E. W. Sellers, D. B. Ryan, and C. K. Hauser, "Noninvasive brain-computer interface enables communication after brainstem stroke," Science Translational Medicine, vol. 6, no. 257, p. 257re7, 2014.
[114] E. A. Aydın, Ö. F. Bay, and İ. Güler, "Implementation of an Embedded Web Server Application for Wireless Control of Brain Computer Interface Based Home Environments," Journal of Medical Systems, vol. 40, no. 1, pp. 1-10, 2016.
[115] S. He et al., "A P300-Based Threshold-Free Brain Switch and Its Application in Wheelchair Control," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 25, no. 6, pp. 715-725, 2017.
[116] S. Amiri, A. Rabbi, L. Azinfar, and R. Fazel-Rezai, "A review of P300, SSVEP, and hybrid P300/SSVEP brain-computer interface systems," Brain-Computer Interface Systems—Recent Progress and Future Prospects, 2013.
[117] F.-B. Vialatte, M. Maurice, J. Dauwels, and A. Cichocki, "Steady-state visually evoked potentials: focus on essential paradigms and future perspectives," Progress in Neurobiology, vol. 90, no. 4, pp. 418-438, 2010.
[118] C. S. Herrmann, "Human EEG responses to 1–100 Hz flicker: resonance phenomena in visual cortex and their potential correlation to cognitive phenomena," Experimental Brain Research, vol. 137, no. 3-4, pp. 346-353, 2001.
[119] M. H. Chang, H. J. Baek, S. M. Lee, and K. S. Park, "An amplitude-modulated visual stimulation for reducing eye fatigue in SSVEP-based brain–computer interfaces," Clinical Neurophysiology, vol. 125, no. 7, pp. 1380-1391, 2014.
[120] G. G. Molina and V. Mihajlovic, "Spatial filters to detect steady-state visual evoked potentials elicited by high frequency stimulation: BCI application," Biomedizinische Technik/Biomedical Engineering, vol. 55, no. 3, pp. 173-182, 2010.
[121] S. M. T. Müller, P. F. Diez, T. F. Bastos-Filho, M. Sarcinelli-Filho, V. Mut, and E. Laciar, "SSVEP-BCI implementation for 37–40 Hz frequency range," in Engineering in Medicine and Biology Society, EMBC, 2011 Annual International Conference of the IEEE, 2011, pp. 6352-6355: IEEE.
[122] I. Volosyak, D. Valbuena, T. Luth, T. Malechka, and A. Graser, "BCI demographics II: How many (and what kinds of) people can use a high-frequency SSVEP BCI?," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 19, no. 3, pp. 232-239, 2011.
[123] B.-K. Min, S. Dähne, M.-H. Ahn, Y.-K. Noh, and K.-R. Müller, "Decoding of top-down cognitive processing for SSVEP-controlled BMI," Scientific Reports, vol. 6, 2016.
[124] G. R. Muller-Putz, R. Scherer, C. Neuper, and G. Pfurtscheller, "Steady-state somatosensory evoked potentials: suitable brain signals for brain-computer interfaces?," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 14, no. 1, pp. 30-37, 2006.
[125] C. Pokorny, C. Breitwieser, and G. R. Müller-Putz, "The role of transient target stimuli in a steady-state somatosensory evoked potential-based brain–computer interface setup," Frontiers in Neuroscience, vol. 10, p. 152, 2016.
[126] G. R. Muller-Putz and G. Pfurtscheller, "Control of an electrical prosthesis with an SSVEP-based BCI," IEEE Transactions on Biomedical Engineering, vol. 55, no. 1, pp. 361-364, 2008.
[127] Q. Liu, K. Chen, Q. Ai, and S. Q. Xie, "Review: recent development of signal processing algorithms for SSVEP-based brain computer interfaces," Journal of Medical and Biological Engineering, vol. 34, no. 4, pp. 299-309, 2013.
[128] M. Bryan et al., "An adaptive brain-computer interface for humanoid robot control," in Humanoid Robots (Humanoids), 2011 11th IEEE-RAS International Conference on, 2011, pp. 199-204: IEEE.
[129] G. Li and D. Zhang, "Brain-Computer Interface Controlled Cyborg: Establishing a Functional Information Transfer Pathway from Human Brain to Cockroach Brain," PLoS ONE, vol. 11, no. 3, p. e0150667, 2016.
[130] F. Gembler, P. Stawicki, and I. Volosyak, "Autonomous Parameter Adjustment for SSVEP-Based BCIs with a Novel BCI Wizard," Frontiers in Neuroscience, vol. 9, 2015.
[131] X. Chen, Y. Wang, M. Nakanishi, X. Gao, T.-P. Jung, and S. Gao, "High-speed spelling with a noninvasive brain–computer interface," Proceedings of the National Academy of Sciences, vol. 112, no. 44, pp. E6058-E6067, 2015.
[132] Y.-J. Chen, S.-C. Chen, I. A. Zaeni, C.-M. Wu, A. J. Tickle, and P.-J. Chen, "The SSVEP-Based BCI Text Input System Using Entropy Encoding Algorithm," Mathematical Problems in Engineering, vol. 2015, 2015.
[133] N.-S. Kwak, K.-R. Müller, and S.-W. Lee, "A lower limb exoskeleton control system based on steady state visual evoked potentials," Journal of Neural Engineering, vol. 12, no. 5, p. 056009, 2015.
[134] Z. Cao and C.-T. Lin, "Inherent fuzzy entropy for the improvement of EEG complexity evaluation," IEEE Transactions on Fuzzy Systems, vol. 26, no. 2, pp. 1032-1035, 2018.
[135] H. J. Hwang et al., "Clinical feasibility of brain-computer interface based on steady-state visual evoked potential in patients with locked-in syndrome: Case studies," Psychophysiology, vol. 54, no. 3, pp. 444-451, 2017.
[136] J. Chen, D. Zhang, A. K. Engel, Q. Gong, and A. Maye, "Application of a single-flicker online SSVEP BCI for spatial navigation," PLoS ONE, vol. 12, no. 5, p. e0178385, 2017.
[137] G. Pfurtscheller, T. Solis-Escalante, R. Ortner, P. Linortner, and G. R. Muller-Putz, "Self-paced operation of an SSVEP-based orthosis with and without an imagery-based "brain switch:" a feasibility study towards a hybrid BCI," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 18, no. 4, pp. 409-414, 2010.
[138] R. Chavarriaga, A. Sobolewski, and J. d. R. Millán, "Errare machinale est: the use of error-related potentials in brain-machine interfaces," Using Neurophysiological Signals that Reflect Cognitive or Affective State, p. 53, 2015.
[139] P. W. Ferrez and J. d. R. Millán, "Error-related EEG potentials generated during simulated brain–computer interaction," IEEE Transactions on Biomedical Engineering, vol. 55, no. 3, pp. 923-929, 2008.
[140] R. Chavarriaga and J. d. R. Millán, "Learning from EEG error-related potentials in noninvasive brain-computer interfaces," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 18, no. 4, pp. 381-388, 2010.
[141] J. Wright, V. G. Macefield, A. van Schaik, and J. C. Tapson, "A Review of Control Strategies in Closed-Loop Neuroprosthetic Systems," Frontiers in Neuroscience, vol. 10, 2016.
[142] X. Artusi, I. K. Niazi, M.-F. Lucas, and D. Farina, "Performance of a simulated adaptive BCI based on experimental classification of movement-related and error potentials," IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 1, no. 4, pp. 480-488, 2011.
[143] I. Iturrate, L. Montesano, and J. Minguez, "Shared-control brain-computer interface for a two dimensional reaching task using EEG error-related potentials," in 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2013, pp. 5258-5262: IEEE.
[144] I. Iturrate, R. Chavarriaga, L. Montesano, J. Minguez, and J. d. R. Millán, "Teaching brain-machine interfaces as an alternative paradigm to neuroprosthetics control," Scientific Reports, vol. 5, 2015.
[145] I. Iturrate, L. Montesano, and J. Minguez, "Robot Reinforcement Learning using EEG-based reward signals," in Robotics and Automation (ICRA), 2010 IEEE International Conference on, 2010, pp. 4822-4829: IEEE.
[146] T. Tsoneva, J. Bieger, and G. Garcia-Molina, "Towards error-free interaction," in 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, 2010, pp. 5799-5802: IEEE.
[147] M. K. Goel, R. Chavarriaga, and J. d. R. Millán, "Cortical current density vs. surface EEG for event-related potential-based brain-computer interface," in Neural Engineering (NER), 2011 5th International IEEE/EMBS Conference on, 2011, pp. 430-433: IEEE.
[148] H. Zhang, R. Chavarriaga, M. K. Goel, L. Gheorghe, and J. d. R. Millán, "Improved recognition of error related potentials through the use of brain connectivity features," in 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2012, pp. 6740-6743: IEEE.
[149] I. Iturrate, R. Chavarriaga, L. Montesano, J. Minguez, and J. d. R. Millán, "Latency correction of error potentials between different experiments reduces calibration time for single-trial classification," in 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2012, pp. 3288-3291: IEEE.
[150] I. Iturrate, L. Montesano, and J. Minguez, "Task-dependent signal variations in EEG error-related potentials for brain–computer interfaces," Journal of Neural Engineering, vol. 10, no. 2, p. 026024, 2013.
[151] J. Omedes, I. Iturrate, R. Chavarriaga, and L. Montesano, "Asynchronous Decoding of Error Potentials During the Monitoring of a Reaching Task," in Systems, Man, and Cybernetics (SMC), 2015 IEEE International Conference on, 2015, pp. 3116-3121: IEEE.
[152] A. Kreilinger, C. Neuper, and G. R. Müller-Putz, "Error potential detection during continuous movement of an artificial arm controlled by brain–computer interface," Medical & Biological Engineering & Computing, vol. 50, no. 3, pp. 223-230, 2012.
[153] J. Omedes, I. Iturrate, L. Montesano, and J. Minguez, "Using frequency-domain features for the generalization of EEG error-related potentials among different tasks," in 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2013, pp. 5263-5266: IEEE.
[154] L. George and A. Lécuyer, "An overview of research on "passive" brain-computer interfaces for implicit human-computer interaction," in International Conference on Applied Bionics and Biomechanics ICABB 2010 - Workshop W1 "Brain-Computer Interfacing and Virtual Reality," 2010.
[155] T. O. Zander and C. Kothe, "Towards passive brain–computer interfaces: applying brain–computer interface technology to human–machine systems in general," Journal of Neural Engineering, vol. 8, no. 2, p. 025005, 2011.
[156] S. Amiri, R. Fazel-Rezai, and V. Asadpour, "A review of hybrid brain-computer interface systems," Advances in Human-Computer Interaction, vol. 2013, p. 1, 2013.
[157] G. Pfurtscheller et al., "The hybrid BCI," Frontiers in Neuroscience, vol. 4, p. 3, 2010.
[158] H. Banville and T. Falk, "Recent advances and open challenges in hybrid brain-computer interfacing: a technological review of non-invasive human research," Brain-Computer Interfaces, vol. 3, no. 1, pp. 9-46, 2016.
[159] U. Chaudhary, B. Xia, S. Silvoni, L. G. Cohen, and N. Birbaumer, "Brain–computer interface–based communication in the completely locked-in state," PLoS Biology, vol. 15, no. 1, p. e1002593, 2017.
[160] T. Luth, D. Ojdanic, O. Friman, O. Prenzel, and A. Graser, "Low level control in a semi-autonomous rehabilitation robotic system via a brain-computer interface," in Rehabilitation Robotics, 2007. ICORR 2007. IEEE 10th International Conference on, 2007, pp. 721-728: IEEE.
[161] Y. Li et al., "An EEG-based BCI system for 2-D cursor control by combining Mu/Beta rhythm and P300 potential," IEEE Transactions on Biomedical Engineering, vol. 57, no. 10, pp. 2495-2505, 2010.
[162] L. Bi, J. Lian, K. Jie, R. Lai, and Y. Liu, "A speed and direction-based cursor control system with P300 and SSVEP," Biomedical Signal Processing and Control, vol. 14, pp. 126-133, 2014.
[163] B. Z. Allison, C. Brunner, C. Altstätter, I. C. Wagner, S. Grissmann, and C. Neuper, "A hybrid ERD/SSVEP BCI for continuous simultaneous two dimensional cursor control," Journal of Neuroscience Methods, vol. 209, no. 2, pp. 299-307, 2012.
[164] F. Duan, D. Lin, W. Li, and Z. Zhang, "Design of a Multimodal EEG-based Hybrid BCI System with Visual Servo Module," IEEE Transactions on Autonomous Mental Development, vol. 7, no. 4, pp. 332-341, 2015.
[165] T. Yu et al., "Enhanced motor imagery training using a hybrid BCI with feedback," IEEE Transactions on Biomedical Engineering, vol. 62, no. 7, pp. 1706-1717, 2015.
[166] B. H. Kim, M. Kim, and S. Jo, "Quadcopter flight control using a low-cost hybrid interface with EEG-based classification and eye tracking," Computers in Biology and Medicine, 2014.
[167] M. Kim, B. H. Kim, and S. Jo, "Quantitative evaluation of a low-cost noninvasive hybrid interface based on EEG and eye movement," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 23, no. 2, pp. 159-168, 2015.
[168] K.-S. Hong and M. J. Khan, "Hybrid Brain–Computer Interface Techniques for Improved Classification Accuracy and Increased Number of Commands: A Review," Frontiers in Neurorobotics, vol. 11, 2017.
[169] I. Choi, I. Rhiu, Y. Lee, M. H. Yun, and C. S. Nam, "A systematic review of hybrid brain-computer interfaces: Taxonomy and usability perspectives," PLoS ONE, vol. 12, no. 4, p. e0176674, 2017.
[170] P. Horki, T. Solis-Escalante, C. Neuper, and G. Müller-Putz, "Combined motor imagery and SSVEP based BCI control of a 2 DoF artificial upper limb," Medical & Biological Engineering & Computing, vol. 49, no. 5, pp. 567-577, 2011.
[171] J. Ma, Y. Zhang, A. Cichocki, and F. Matsuno, "A Novel EOG/EEG Hybrid Human–Machine Interface Adopting Eye Movements and ERPs: Application to Robot Control," IEEE Transactions on Biomedical Engineering, vol. 62, no. 3, pp. 876-889, 2015.
[172] M. J. Khan, K.-S. Hong, N. Naseer, and M. R. Bhutta, "Hybrid EEG-NIRS based BCI for quadcopter control," in Society of Instrument and Control Engineers of Japan (SICE), 2015 54th Annual Conference of the, 2015, pp. 1177-1182: IEEE.
[173] T. Malechka, T. Tetzel, U. Krebs, D. Feuser, and A. Graeser, "sBCI-Headset—Wearable and Modular Device for Hybrid Brain-Computer Interface," Micromachines, vol. 6, no. 3, pp. 291-311, 2015.
[174] A. R. Hunt and A. Kingstone, "Covert and overt voluntary attention: linked or independent?," Cognitive Brain Research, vol. 18, no. 1, pp. 102-105, 2003.
[175] S. P. Kelly, J. J. Foxe, G. Newman, and J. A. Edelman, "Prepare for conflict: EEG correlates of the anticipation of target competition during overt and covert shifts of visual attention," European Journal of Neuroscience, vol. 31, no. 9, pp. 1690-1700, 2010.
[176] S. Kelly, E. Lalor, R. Reilly, and J. Foxe, "Independent brain computer interface control using visual spatial attention-dependent modulations of parieto-occipital alpha," in Neural Engineering, 2005. Conference Proceedings. 2nd International IEEE EMBS Conference on, 2005, pp. 667-670: IEEE.
[177] L. Tonin, R. Leeb, and J. del R Millán, "Time-dependent approach for single trial classification of covert visuospatial attention," Journal of Neural Engineering, vol. 9, no. 4, p. 045011, 2012.
[178] L. Tonin, R. Leeb, A. Sobolewski, and J. del R Millán, "An online EEG BCI based on covert visuospatial attention in absence of exogenous stimulation," Journal of Neural Engineering, vol. 10, no. 5, p. 056007, 2013.
[179] M. S. Treder, A. Bahramisharif, N. M. Schmidt, M. A. van Gerven, and B. Blankertz, "Brain-computer interfacing using modulations of alpha activity induced by covert shifts of attention," Journal of NeuroEngineering and Rehabilitation, vol. 8, no. 1, p. 24, 2011.
[180] O. Bai et al., "Prediction of human voluntary movement before it occurs," Clinical Neurophysiology, vol. 122, no. 2, pp. 364-372, 2011.
[181] A. Muralidharan, J. Chae, and D. Taylor, "Extracting attempted hand movements from EEGs in people with complete hand paralysis following stroke," Frontiers in Neuroscience, vol. 5, 2011.
[182] L. Yang, H. Leung, M. Plank, J. Snider, and H. Poizner, "EEG activity during movement planning encodes upcoming peak speed and acceleration and improves the accuracy in predicting hand kinematics," IEEE Journal of Biomedical and Health Informatics, vol. 19, no. 1, pp. 22-28, 2015.
[183] A. Frisoli et al., "A New Gaze-BCI-Driven Control of an Upper Limb Exoskeleton for Rehabilitation in Real-World Tasks," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 1169-1179, 2012.
[184] E. Lew, R. Chavarriaga, S. Silvoni, and J. d. R. Millán, "Detection of Self-Paced Reaching Movement Intention from EEG Signals," Frontiers in Neuroengineering, vol. 5, 2012.
[185] O. Bai, P. Lin, S. Vorbach, J. Li, S. Furlani, and M. Hallett, "Exploration of computational methods for classification of movement intention during human voluntary movement from single trial EEG," Clinical Neurophysiology, vol. 118, no. 12, pp. 2637-2655, 2007.
[186] J. Zhou, J. Yao, J. Deng, and J. P. Dewald, "EEG-based classification for elbow versus shoulder torque intentions involving stroke subjects," Computers in Biology and Medicine, vol. 39, no. 5, pp. 443-452, 2009.
[187] J. Ibáñez et al., "Detection of the onset of upper-limb movements based on the combined analysis of changes in the sensorimotor rhythms and slow cortical potentials," Journal of Neural Engineering, vol. 11, no. 5, p. 056009, 2014.
[188] E. W. Sellers and E. Donchin, "A P300-based brain–computer interface: initial tests by ALS patients," Clinical Neurophysiology, vol. 117, no. 3, pp. 538-548, 2006.
[189] T. W. Picton, M. S. John, A. Dimitrijevic, and D. Purcell, "Human auditory steady-state responses," International Journal of Audiology, vol. 42, no. 4, pp. 177-219, 2003.
[190] F. Ferracuti, A. Freddi, S. Iarlori, S. Longhi, and P. Peretti, "Auditory Paradigm for a P300 BCI system using Spatial Hearing," in 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2013, pp. 871-876: IEEE.
[191] K. Hamada, H. Mori, H. Shinoda, and T. M. Rutkowski, "Airborne ultrasonic tactile display brain-computer interface paradigm," arXiv preprint arXiv:1404.4184, 2014.
3 [192] A.-M. Brouwer and J. B. Van Erp, "A tactile P300 brain-computer interface," Frontiers
4

pt
5
in neuroscience, vol. 4, p. 19, 2010.
6 [193] C. Guger et al., "Complete locked-in and locked-in patients: command following
7 assessment and communication with vibro-tactile P300 and motor imagery brain-
8 computer interface tools," Frontiers in neuroscience, vol. 11, p. 251, 2017.
9 [194] Z. R. Lugo et al., "A vibrotactile p300-based brain–computer interface for consciousness

cri
10 detection and communication," Clinical EEG and neuroscience, vol. 45, no. 1, pp. 14-21,
11
2014.
12
13
[195] A. Furdea et al., "A new (semantic) reflexive brain–computer interface: In search for a
14 suitable classifier," Journal of neuroscience methods, vol. 203, no. 1, pp. 233-240, 2012.
15 [196] C. A. Ruf et al., "Semantic classical conditioning and brain-computer interface control:
16 encoding of affirmative and negative thinking," Frontiers in neuroscience, vol. 7, p. 23,

us
17 2013.
18 [197] N. Birbaumer, F. Piccione, S. Silvoni, and M. Wildgruber, "Ideomotor silence: the case
19
of complete paralysis and brain–computer interfaces (BCI)," Psychological research, vol.
20
21 76, no. 2, pp. 183-191, 2012.
22 [198] G. Gallegos-Ayala, A. Furdea, K. Takano, C. A. Ruf, H. Flor, and N. Birbaumer, "Brain
23
24
25
26
27
28
an
communication in a completely locked-in patient using bedside near-infrared
spectroscopy," Neurology, vol. 82, no. 21, pp. 1930-1932, 2014.
[199] U. Chaudhary, N. Birbaumer, and A. Ramos-Murguialday, "Brain–computer interfaces
for communication and rehabilitation," Nature Reviews Neurology, vol. 12, no. 9, p. 513,
2016.
[200] Y. Wang and S. Makeig, "Predicting intended movement direction using EEG from human posterior parietal cortex," in Foundations of Augmented Cognition. Neuroergonomics and Operational Neuroscience: Springer, 2009, pp. 437-446.
[201] P. S. Hammon, S. Makeig, H. Poizner, E. Todorov, and V. R. de Sa, "Predicting Reaching Targets from Human EEG," IEEE Signal Processing Magazine, vol. 25, no. 1, pp. 69-77, 2008.
[202] A. Muralidharan, J. Chae, and D. M. Taylor, "Early detection of hand movements from electroencephalograms for stroke therapy applications," Journal of Neural Engineering, vol. 8, no. 4, p. 046003, 2011.
[203] N. A. Bhagat et al., "Detecting movement intent from scalp EEG in a novel upper limb robotic rehabilitation system for stroke," in 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2014, pp. 4127-4130.
[204] R. Xu, N. Jiang, C. Lin, N. Mrachacz-Kersting, K. Dremstrup, and D. Farina, "Enhanced low-latency detection of motor intention from EEG for closed-loop brain-computer interface applications," IEEE Transactions on Biomedical Engineering, vol. 61, no. 2, pp. 288-296, 2014.
[205] S. Halder et al., "Prediction of auditory and visual P300 brain-computer interface aptitude," PLoS ONE, vol. 8, no. 2, p. e53513, 2013.
[206] I. Käthner, C. A. Ruf, E. Pasqualotto, C. Braun, N. Birbaumer, and S. Halder, "A portable auditory P300 brain–computer interface with directional cues," Clinical Neurophysiology, vol. 124, no. 2, pp. 327-338, 2013.
[207] D. S. Klobassa et al., "Toward a high-throughput auditory P300-based brain–computer interface," Clinical Neurophysiology, vol. 120, no. 7, pp. 1252-1261, 2009.
AUTHOR SUBMITTED MANUSCRIPT - JNE-102497.R2 Page 40 of 43
[208] G. Placidi, D. Avola, A. Petracca, F. Sgallari, and M. Spezialetti, "Basis for the implementation of an EEG-based single-trial binary brain computer interface through the disgust produced by remembering unpleasant odors," Neurocomputing, vol. 160, pp. 308-318, 2015.
[209] Y. J. Kim et al., "A study on a robot arm driven by three-dimensional trajectories predicted from non-invasive neural signals," BioMedical Engineering OnLine, vol. 14, no. 1, p. 1, 2015.
[210] A. Úbeda, E. Hortal, J. Alarcón, R. Salazar-Varas, A. Sánchez, and J. M. Azorín, "Online detection of horizontal hand movements from low frequency EEG components," in 2015 7th International IEEE/EMBS Conference on Neural Engineering (NER), 2015, pp. 214-217.
[211] K. Kiguchi, T. D. Lalitharatne, and Y. Hayashi, "Estimation of Forearm Supination/Pronation Motion Based on EEG Signals to Control an Artificial Arm," Journal of Advanced Mechanical Design, Systems, and Manufacturing, vol. 7, no. 1, pp. 74-81, 2013.
[212] C. Breitwieser, C. Pokorny, C. Neuper, and G. R. Müller-Putz, "Somatosensory evoked potentials elicited by stimulating two fingers from one hand – usable for BCI?," in 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2011, pp. 6373-6376.
[213] M. van der Waal, M. Severens, J. Geuze, and P. Desain, "Introducing the tactile speller: an ERP-based brain–computer interface for communication," Journal of Neural Engineering, vol. 9, no. 4, p. 045002, 2012.
[214] E. Hortal et al., "SVM-based Brain–Machine Interface for controlling a robot arm through four mental tasks," Neurocomputing, vol. 151, pp. 116-121, 2015.
[215] N. Birbaumer, "Slow cortical potentials: their origin, meaning, and clinical use," Tilburg, The Netherlands: Tilburg Univ. Press, 1997, pp. 25-39.
[216] N. Birbaumer et al., "The thought translation device (TTD) for completely paralyzed patients," IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 2, pp. 190-193, 2000.
[217] A. Kübler, B. Kotchoubey, J. Kaiser, J. R. Wolpaw, and N. Birbaumer, "Brain–computer communication: Unlocking the locked in," Psychological Bulletin, vol. 127, no. 3, p. 358, 2001.
[218] T. Hinterberger, J. M. Houtkooper, and B. Kotchoubey, "Effects of feedback control on slow cortical potentials and random events," in Parapsychological Association Convention, 2004, pp. 39-50.
[219] D. Tkach, J. Reimer, and N. G. Hatsopoulos, "Observation-based learning for brain–machine interfaces," Current Opinion in Neurobiology, vol. 18, no. 6, pp. 589-594, 2008.
[220] H. Agashe and J. L. Contreras-Vidal, "Observation-based training for neuroprosthetic control of grasping by amputees," in 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2014, pp. 3989-3992.
[221] H. A. Agashe and J. L. Contreras-Vidal, "Observation-based calibration of brain-machine interfaces for grasping," in 2013 6th International IEEE/EMBS Conference on Neural Engineering (NER), 2013, pp. 1-4.
[222] A. N. Belkacem, H. Hirose, N. Yoshimura, D. Shin, and Y. Koike, "Classification of four eye directions from EEG signals for eye-movement-based communication systems," Journal of Medical and Biological Engineering, 2013.
[223] R. Ramli, H. Arof, F. Ibrahim, M. Y. I. Idris, and A. Khairuddin, "Classification of eyelid position and eyeball movement using EEG signals," Malaysian Journal of Computer Science, vol. 28, no. 1, pp. 28-45, 2015.
[224] A. N. Belkacem et al., "Real-time control of a video game using eye movements and two temporal EEG sensors," Computational Intelligence and Neuroscience, vol. 2015, p. 1, 2015.
[225] C. Brunner et al., "BNCI Horizon 2020: towards a roadmap for the BCI community," Brain-Computer Interfaces, vol. 2, no. 1, pp. 1-10, 2015.
[226] E. M. Holz, L. Botrel, T. Kaufmann, and A. Kübler, "Long-term independent brain-computer interface home use improves quality of life of a patient in the locked-in state: a case study," Archives of Physical Medicine and Rehabilitation, vol. 96, no. 3, pp. S16-S26, 2015.
[227] P. Wang, J. Lu, B. Zhang, and Z. Tang, "A review on transfer learning for brain-computer interface classification," in 2015 5th International Conference on Information Science and Technology (ICIST), 2015, pp. 315-322.
[228] V. Jayaram, M. Alamgir, Y. Altun, B. Schölkopf, and M. Grosse-Wentrup, "Transfer learning in brain-computer interfaces," IEEE Computational Intelligence Magazine, vol. 11, no. 1, pp. 20-31, 2016.
[229] F. Lotte, "Signal processing approaches to minimize or suppress calibration time in oscillatory activity-based brain–computer interfaces," Proceedings of the IEEE, vol. 103, no. 6, pp. 871-890, 2015.
[230] N. R. Waytowich, J. Faller, J. O. Garcia, J. M. Vettel, and P. Sajda, "Unsupervised adaptive transfer learning for steady-state visual evoked potential brain-computer interfaces," in 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2016, pp. 4135-4140.
[231] A. Bashashati, M. Fatourechi, R. K. Ward, and G. E. Birch, "A survey of signal processing algorithms in brain-computer interfaces based on electrical brain signals," Journal of Neural Engineering, vol. 4, no. 2, pp. R32-R57, 2007.
[232] B. J. Edelman, B. Baxter, and B. He, "EEG Source Imaging Enhances the Decoding of Complex Right-Hand Motor Imagery Tasks," IEEE Transactions on Biomedical Engineering, vol. 63, no. 1, pp. 4-14, 2016.
[233] N. Tomida, T. Tanaka, S. Ono, M. Yamagishi, and H. Higashi, "Active data selection for motor imagery EEG classification," IEEE Transactions on Biomedical Engineering, vol. 62, no. 2, pp. 458-467, 2015.
[234] M. Längkvist, L. Karlsson, and A. Loutfi, "A review of unsupervised feature learning and deep learning for time-series modeling," Pattern Recognition Letters, vol. 42, pp. 11-24, 2014.
[235] J. Schmidhuber, "Deep learning in neural networks: An overview," Neural Networks, vol. 61, pp. 85-117, 2015.
[236] I. Sturm, S. Lapuschkin, W. Samek, and K.-R. Müller, "Interpretable deep neural networks for single-trial EEG classification," Journal of Neuroscience Methods, vol. 274, pp. 141-145, 2016.

1
2
3 [237] A. H. Marblestone, G. Wayne, and K. P. Kording, "Toward an integration of deep
4

pt
5
learning and neuroscience," Frontiers in computational neuroscience, vol. 10, 2016.
6 [238] S. Perdikis, R. Leeb, and J. d R Millán, "Context-aware adaptive spelling in motor
7 imagery BCI," Journal of neural engineering, vol. 13, no. 3, p. 036018, 2016.
8 [239] B. Dal Seno, M. Matteucci, and L. T. Mainardi, "The utility metric: a novel method to
9 assess the overall performance of discrete brain–computer interfaces," IEEE Transactions

cri
10 on Neural Systems and Rehabilitation Engineering, vol. 18, no. 1, pp. 20-28, 2010.
11
[240] J. d. R. Millán, "Brain-Machine Interfaces: The Perception-Action Closed Loop: A Two-
12
13
Learner System," IEEE Systems, Man, and Cybernetics Magazine, vol. 1, no. 1, pp. 6-8,
14 2015.
15 [241] M. S. Fifer et al., "Simultaneous Neural Control of Simple Reaching and Grasping with
16 the Modular Prosthetic Limb using Intracranial EEG," 2014.

us
17 [242] D. P. McMullen et al., "Demonstration of a Semi-Autonomous Hybrid Brain–Machine
18 Interface Using Human Intracranial EEG, Eye Tracking, and Computer Vision to Control
19
a Robotic Upper Limb Prosthetic," Neural Systems and Rehabilitation Engineering, IEEE
20
21 Transactions on, vol. 22, no. 4, pp. 784-796, 2014.
22 [243] R. Leeb, R. Chavarriaga, S. Perdikis, I. Iturrate, and J. d. R. Millán, "Moving Brain-
23
24
25
26
27
28
an
Controlled Devices Outside the Lab: Principles and Applications," in Recent Progress in
Brain and Cognitive Engineering: Springer, 2015, pp. 73-94.
[244] C. Vidaurre, C. Klauer, T. Schauer, A. Ramos-Murguialday, and K.-R. Müller, "EEG-
based BCI for the linear control of an upper-limb neuroprosthesis," Medical Engineering
& Physics, vol. 38, no. 11, pp. 1195-1204, 2016.
dM
29 [245] J. P. Cunningham, P. Nuyujukian, V. Gilja, C. A. Chestek, S. I. Ryu, and K. V. Shenoy,
30 "A closed-loop human simulator for investigating the role of feedback control in brain-
31 machine interfaces," Journal of neurophysiology, vol. 105, no. 4, pp. 1932-1949, 2011.
32 [246] M. C. Dadarlat, J. E. O'doherty, and P. N. Sabes, "A learning-based approach to artificial
33 sensory feedback leads to optimal integration," Nature neuroscience, vol. 18, no. 1, pp.
34
138-144, 2015.
35
36
[247] S. N. Flesher et al., "Intracortical microstimulation of human somatosensory cortex,"
37 Science translational medicine, vol. 8, no. 361, pp. 361ra141-361ra141, 2016.
38 [248] F. D. Broccard et al., "Closed-Loop Brain–Machine–Body Interfaces for Noninvasive
pte

39 Rehabilitation of Movement Disorders," Annals of biomedical engineering, pp. 1-21,


40 2014.
41 [249] S. Suresh, Y. Liu, and R. C.-H. Yeow, "Development of a Wearable
42
Electroencephalographic Device for Anxiety Monitoring," Journal of Medical Devices,
43
44 vol. 9, no. 3, p. 030917, 2015.
45 [250] J. Saab, B. Battes, and M. Grosse-Wentrup, Simultaneous EEG recordings with dry and
46 wet electrodes in motor-imagery. na, 2011.
ce

47 [251] T. O. Zander et al., "A dry EEG-system for scientific research and brain–computer
48 interfaces," Frontiers in neuroscience, vol. 5, p. 53, 2011.
49 [252] T. R. Mullen et al., "Real-time neuroimaging and cognitive monitoring using wearable
50
51
dry EEG," IEEE Transactions on Biomedical Engineering, vol. 62, no. 11, pp. 2553-
52 2567, 2015.
Ac

53 [253] Y. Chen et al., "A high-security EEG-based login system with RSVP stimuli and dry
54 electrodes," IEEE Transactions on Information Forensics and Security, vol. 11, no. 12,
55 pp. 2635-2647, 2016.
56
57
58 42
59
60
[254] M. Ordikhani-Seyedlar, M. A. Lebedev, H. B. Sorensen, and S. Puthusserypady, "Neurofeedback therapy for enhancing visual attention: state-of-the-art and challenges," Frontiers in Neuroscience, vol. 10, 2016.
[255] S. Wyckoff and N. Birbaumer, "Neurofeedback and Brain–Computer Interfaces," The Handbook of Behavioral Medicine, pp. 275-312, 2014.
[256] R. Abiri, J. McBride, X. Zhao, and Y. Jiang, "A real-time brainwave based neuro-feedback system for cognitive enhancement," in ASME 2015 Dynamic Systems and Control Conference (Columbus, OH), 2015.
[257] N. J. Steiner, E. C. Frenette, K. M. Rene, R. T. Brennan, and E. C. Perrin, "In-school neurofeedback training for ADHD: sustained improvements from a randomized control trial," Pediatrics, peds.2013-2059, 2014.
[258] N. J. Steiner, E. C. Frenette, K. M. Rene, R. T. Brennan, and E. C. Perrin, "Neurofeedback and cognitive attention training for children with attention-deficit hyperactivity disorder in schools," Journal of Developmental & Behavioral Pediatrics, vol. 35, no. 1, pp. 18-27, 2014.
[259] R. Abiri, X. Zhao, and Y. Jiang, "A Real Time EEG-Based Neurofeedback Platform for Attention Training," in Biomedical Engineering Society Annual Meeting (BMES 2016), 2016.
[260] Y. Jiang, R. Abiri, and X. Zhao, "Tuning Up the Old Brain with New Tricks: Attention Training via Neurofeedback," Frontiers in Aging Neuroscience, vol. 9, 2017.
[261] D. S. Bassett and A. N. Khambhati, "A network engineering perspective on probing and perturbing cognition with neurofeedback," Annals of the New York Academy of Sciences, 2017.
[262] R. Abiri, X. Zhao, and Y. Jiang, "Controlling gestures of a social robot in a brain machine interface platform," in 6th International Brain-Computer Interface Meeting (2016 BCI), 2016, p. 122.
[263] R. Abiri, S. Borhani, X. Zhao, and Y. Jiang, "Real-Time Neurofeedback for Attention Training: Brainwave-Based Brain Computer Interface," in Organization for Human Brain Mapping (OHBM 2017), 2017.