Touchless Virtual Keyboard Controlled by Eye Blinking and EEG Signals

Krzysztof Dobosz and Klaudiusz Stawski
Silesian University of Technology
1 Introduction
Communication and the ability to interact with the environment, especially with
other people, are fundamental needs for human relationships. For people who
suffer from severe physical disabilities because they are completely paralyzed,
satisfying this need is almost impossible. Their physical activity is very often
limited to eye blinking. Although this is sufficient to communicate using Morse
code, that method is very cumbersome. Human-Computer Interaction (HCI)
researchers explore the opportunities of using as many sensory channels as
possible [24]. Among others, to support people with impaired physical activity,
bioelectrical brain signals can be used to provide an alternative communication channel.
A Brain-Computer Interface (BCI) is able to recognize changes in the ongoing
bioelectrical signals and to map them to appropriate commands in order to realize
communication aids.
The aim of the project was to propose a useful BCI that allows users with
severe motor disabilities to use a virtual keyboard. Electroencephalographic
(EEG) devices mostly measure fundamental human activity states such as attention
and relaxation (meditation). The devices sometimes also integrate an electromyographic
sensor that can be used for eye-blink recognition. Such signals can then control
a touchless virtual keyboard.
2 Related Works
3 Research Environment
The main element of the research environment is the EEG device. One of the
commonly available consumer EEG devices is the NeuroSky MindWave Mobile
[15]. Devices developed by this producer are inexpensive, work well using a dry
electrode, and have their own Software Development Kit, so software developers
can easily create their own applications. The NeuroSky MindWave Mobile also has
good measurement accuracy, which can result in a wider group of potential users.
The device is composed of one dry electrode and a specially designed electronic
circuit. It consists of a headset, an ear clip, and a sensor arm. The headset's
reference and ground electrodes are on the ear clip, and the EEG electrode is on
the sensor arm, resting on the forehead above the eye. The MindWave Mobile
safely measures and outputs the EEG power spectrum (alpha waves, beta waves,
etc.). The sensor arm also includes an EMG sensor that allows the blink strength
to be measured. The headset transfers data via Bluetooth.
Typically, real-time EEG signal processing and classification algorithms are
designed for powerful machines; some of them use a weighted combination of
various classifiers. However, the producer's firmware reduces the complexity of
managing the connection and handles parsing of the data stream from the EEG
headset. This convenient software interface supplies both raw data and preprocessed
data. The software developer receives the values of attention and meditation
normalized to the range 0-100%. The movement of the muscles responsible for
eye blinking is normalized in the same way. This is very convenient and
helps avoid raw data analysis.
The main elements of the application's user interface are:
– text area presenting the typed text: selected characters are automatically
added at the end of the text,
– virtual keyboard containing five rows of keys and the list of predicted words,
– configuration panel: informs about the state of the connection to the MindWave
device and covers the following information: current values of attention, meditation,
and blinks, configuration settings, and the Calibrate button used to personalize
the thresholds,
– simulation panel: covers two screen buttons, DoubleBlink and ChangeMode.
The virtual keyboard allows the user to personalize its operating parameters.
The settings are: meditation threshold, attention threshold, time above the
thresholds, eye-blink strength, the time period for an eye double blink, and the
interval of switching between the keyboard sections.
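The settings above can be collected into a simple configuration structure. The sketch below is illustrative: the field names are our own, and the defaults for blink strength (and the threshold values beyond the 60-80 range studied later) are assumptions, not values from the application's source.

```python
from dataclasses import dataclass

@dataclass
class KeyboardSettings:
    """Operating parameters of the virtual keyboard (illustrative field names)."""
    attention_threshold: int = 65       # %, range studied in the experiments: 60-80
    meditation_threshold: int = 80      # %, range studied in the experiments: 60-80
    time_above_threshold_s: float = 3.0   # required dwell above a threshold
    blink_strength: int = 50            # %, minimal strength counted as a blink (assumed)
    double_blink_period_s: float = 0.6    # time window for an eye double blink
    section_switch_interval_s: float = 1.5  # interval of switching between sections

    def validate(self) -> None:
        """Reject percentage values outside the 0-100 range."""
        for name in ("attention_threshold", "meditation_threshold", "blink_strength"):
            value = getattr(self, name)
            if not 0 <= value <= 100:
                raise ValueError(f"{name} must be in 0-100, got {value}")
```

Grouping the parameters this way makes the Calibrate action a matter of rewriting one object rather than scattered state.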
4 Method
The virtual keyboard can work in two modes: character selection and predicted-word
selection. Expecting that all operations controlled by EEG signals will be executed
very slowly, we decided that the switch between the keyboard sections will
be executed automatically at specified intervals. Assuming the switch between
single keys takes place every second, entering a ten-letter word would take
3-5 minutes, which is an unsatisfactory result. In order to accelerate the selection
of the correct character, a Divide and Conquer algorithm was implemented.
Therefore, first a row of characters is selected. The highlighted row of keys changes
to the next one after a certain time until a selection is made. The change of rows
proceeds automatically from top to bottom (Fig. 2); then the first row is highlighted
again.
After the row selection, the line of six characters is divided into two parts,
which are alternately highlighted. After the part selection, the current group of
three characters is divided into single keys (Fig. 3). These are also alternately
highlighted within a certain time period. Finally, after the key selection, the
corresponding character is inserted at the end of the text area.
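The scanning procedure can be modelled as a step counter. A minimal sketch, assuming an alphabetical 5×6 layout and a simple key-by-key scan inside each group of three; the actual layout and intra-group division in the application may differ:

```python
# Assumed alphabetical 5x6 layout; the keys of the last row are a guess.
ROWS = ["ABCDEF", "GHIJKL", "MNOPQR", "STUVWX", "YZ .,?"]

def highlight_steps(char: str) -> int:
    """Count highlight periods before `char` can be confirmed: rows are
    scanned top to bottom, the chosen row splits into two groups of three,
    and the chosen group is scanned key by key."""
    for row_index, row in enumerate(ROWS, start=1):
        if char in row:
            pos = row.index(char)        # 0..5 within the row
            group_index = pos // 3 + 1   # 1 or 2
            key_index = pos % 3 + 1      # 1..3 within the group
            return row_index + group_index + key_index
    raise ValueError(f"character {char!r} is not on the keyboard")
```

Under this model, "A" needs 3 periods (first row, first group, first key), matching the quickest case discussed later.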
5 Experiments
Preliminary tests showed that the interval of switching between the keyboard
sections (which includes the time for a double blink) should be set to 1.5 s.
Table 1 presents the results of the experiment (time in seconds, errk: the number
of incorrectly selected keys) when the users typed the pangram without the support
of predicted words. The calculated text entry factors are: CPM = 5.52, which is
about WPM = 1.11 (considering that 1 word equals about 5 characters for English
text); the calculated MSD error rate [22] equals 7.03%.
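The MSD error rate [22] is the Levenshtein distance between the presented and transcribed text, divided by the length of the longer string. A straightforward implementation:

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum string distance: insertions, deletions, and substitutions."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def msd_error_rate(presented: str, transcribed: str) -> float:
    """MSD error rate in percent: distance over the longer string's length."""
    return 100.0 * levenshtein(presented, transcribed) / max(len(presented), len(transcribed))
```

For example, one wrong key in a four-character phrase gives a 25% error rate.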
Although the keyboard can be calibrated at any moment, for the experiment
including predicted words we decided to set the attention and meditation thresholds
manually to study their influence. Their values were selected after preliminary
observations: a threshold that was too low caused accidental switches to the
predicted-words mode, while a value that was too high caused problems with
maintaining a sufficiently high level for a certain period of time. After initial
tests we decided that the required period above the attention or meditation
threshold should equal 3 seconds. The next tables (Table 2 and Table 3) present
the results when key typing is supported by the predicted words. The ranges of
the attention threshold thrA and the meditation threshold thrM were set to 60-80,
selected after preliminary observations. The value of errk was very similar to
that obtained previously, so it is replaced by errs, the number of incorrect
switches to the list of predicted words.
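The switch to the predicted-words mode can be sketched as a dwell check over timestamped signal samples. The function below is an illustration, not the application's actual code; it assumes samples arrive as (time in seconds, value) pairs:

```python
def sustained_above(samples, threshold, required_s):
    """Return True if the (time_s, value) samples stay above `threshold`
    for at least `required_s` consecutive seconds."""
    run_start = None
    for t, value in samples:
        if value > threshold:
            if run_start is None:
                run_start = t              # a new run begins
            if t - run_start >= required_s:
                return True                # dwell long enough: switch modes
        else:
            run_start = None               # the run is broken
    return False
```

A single sample dipping below the threshold resets the run, which is exactly why a too-high threshold made the 3-second dwell hard to reach.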
Unfortunately, only three words of the pangram belong to the dictionary
(ICH, GDY, NIE), but they took first places on the lists and consequently
were quickly selected together with the whitespace at the end. The best
results were obtained for thrA equal to 65 (WPM = 1.21) and thrM equal to 80
(WPM = 1.27). The average number of unexpected switches erravr to the list
of predicted words decreased when the thresholds were increased. In the case
of thrM = 80, only one of the users had accidental switches to the predicted-words
mode.
6 Discussion
The time of a single key selection depends on the number of time periods
tp representing the interval of the keyboard section change. The presented studies
used tp = 1500 ms. This time period includes the time for the user's decision about
the selection and the time of the double blink tdb. Assuming the user needs the
full tp time for decision and blinking, this gives, e.g., 3tp to insert "A" and
7tp to insert "R", because sequentially highlighted are: 3 rows, 2 groups (in the
third row), and 2 keys (in the second group of the third row). Therefore, the
arithmetic average time of character insertion (including the space key) is
6.22 · tp; however, it is 6.51 · tp for the pangram used in the studies. After a
simple calculation we get the longest faultless expected time of entering the
pangram: it equals 498 s (WPM = 1.23).
On the other hand, users can blink immediately when the required row, group
of keys, or key is highlighted. Since the sequence of keys is easy to remember
(in our research tool it is alphabetical order), an experienced user will know in
advance when to blink. Since each key is reached after three double blinks,
three of the tp periods can be shortened to 3tdb. Hence, e.g., the user needs 3tdb
to insert "A" and 4tp + 3tdb to insert "R". The duration of a single eye blink
is 100-400 ms [1], so a double eye blink can take at most 800 ms. During the
studies we assumed that 600 ms would be a sufficient tdb period for a
non-accidental double eye blink. Repeating the calculation, this time the
arithmetic average time of character insertion is 3.22 · tp + 3 · tdb; however,
for the pangram used in the studies it is 3.51 · tp + 3 · tdb. Hence, we get the
best possible time of pangram entry: 360.3 s (WPM = 1.7).
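Both theoretical bounds can be checked numerically. The pangram text itself is not given in this excerpt; a length of 51 characters (spaces included) is an assumption consistent with the reported averages and totals:

```python
TP = 1.5    # tp: highlight interval in seconds
TDB = 0.6   # tdb: assumed double-blink window in seconds
N = 51      # pangram length consistent with the reported totals (assumption)

# Slow bound: the user always needs the full period at every highlighted step;
# the pangram averages 6.51 periods per character.
slow_total = 6.51 * TP * N

# Fast bound: the three confirming periods per character shrink to tdb each.
fast_total = (3.51 * TP + 3 * TDB) * N

def wpm(seconds, chars=N):
    """Words per minute, taking 1 word as about 5 characters."""
    return (chars / 5) / (seconds / 60)

print(round(slow_total, 1), round(wpm(slow_total), 2))  # ~498.0 s, WPM ~1.23
print(round(fast_total, 1), round(wpm(fast_total), 2))  # ~360.3 s, WPM ~1.7
```

Under these assumptions the arithmetic reproduces the reported 498 s (WPM = 1.23) and 360.3 s (WPM = 1.7).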
Summarizing, the expected results of typing without word prediction should
be in the range of 360.3-498 seconds. Only one test result (the third user in the
fourth trial) falls within this range (Table 1). However, we need to remember
that the users were beginners without experience in this kind of HCI. They caused
many delays: they selected incorrect keyboard sections by accidental double blinks,
or thought too long and exceeded the highlighting time. Considering this, the
obtained results can be regarded as satisfactory. We can expect some improvement
after long-term training.
Going further, and with regard to experienced users, every tp period could
be set to the tdb value. Then the text entry speed for the pangram could grow to
WPM = 3.15, which corresponds to 3.8 seconds per key selection. Such a kind of
interaction does not need the EEG support in the form described in this paper.
However, such a solution (with a very short time for user interaction) would
be hard to use, resulting in many errors and, consequently, reduced efficiency.
7 Conclusions
The proposed touchless keyboard uses eye double blinks. The result obtained
during evaluation is WPM = 1.11 with an MSD error rate of 7.03%. In the area
of this kind of text entry system, this result is good, although after theoretical
calculations we expected the error-free writing speed to be in the range of
1.23-1.7 WPM.
Controlling the keyboard with blinks can be supported by EEG signals; this
is the novelty of the proposed approach. The results of the experiments show
that the best mode is the one using a meditation threshold of 80. It achieves
an efficiency of 1.27 WPM, but we should note that word prediction was used.
When meditation is selected as the parameter for the predicted-words mode, the
number of errors is lower in comparison to the experiment using attention. This
is due to the fact that, when using the application, the user unwittingly focuses
on the action, e.g. the selection of the letters, which can inadvertently change
the application mode, whereas a longer meditation above the threshold can easily
be achieved by closing the eyes, which is done only for the intended purpose.
It is also possible to introduce improvements in the keyboard. The Divide and
Conquer algorithm could be used to break the set of keys into two subsets at
every step; however, this would involve reconfiguring the number of rows and
columns, which might not be comfortable in use. Next, the order of keys could
depend on the frequency of the corresponding characters in the national language.
Such an HCI solution requires high eye activity, which can be very uncomfortable
for the user: during the studies, after only the first attempt, most users
complained of eye strain.
Acknowledgment
References
1. BioNumbers: Average duration of a single eye blink (Apr 2008), http://bionumbers.hms.harvard.edu/bionumber.aspx?s=y&id=100706
2. Cecotti, H.: A self-paced and calibration-less SSVEP-based brain-computer interface speller. IEEE Transactions on Neural Systems and Rehabilitation Engineering 18(2), 127-133 (2010)
3. Chapin, J.K., Moxon, K.A., Markowitz, R.S., Nicolelis, M.A.: Real-time control of a robot arm using simultaneously recorded neurons in the motor cortex. Nature Neuroscience 2(7), 664-670 (1999)
4. Duchowski, A.T.: Eye tracking methodology: Theory and practice. Springer (2017)
5. Hwang, H.J., Kim, S., Choi, S., Im, C.H.: EEG-based brain-computer interfaces: a thorough literature survey. International Journal of Human-Computer Interaction 29(12), 814-826 (2013)
6. Kang, K., Hong, J.: A framework for computer interface using EEG and EMG. In: Proceedings of the 2015 Conference on Research in Adaptive and Convergent Systems. pp. 472-473. ACM (2015)
7. Krapic, L., Lenac, K., Ljubic, S.: Integrating blink click interaction into a head tracking system: implementation and usability issues. Universal Access in the Information Society 14(2), 247-264 (2015)
8. Królak, A., Strumillo, P.: Eye-blink detection system for human-computer interaction. Universal Access in the Information Society pp. 1-11 (2012)
9. Lin, C.T., Chen, Y.C., Huang, T.Y., Chiu, T.T., Ko, L.W., Liang, S.F., Hsieh, H.Y., Hsu, S.H., Duann, J.R.: Development of wireless brain computer interface with embedded multitask scheduling and its application on real-time driver's drowsiness detection and warning. IEEE Transactions on Biomedical Engineering 55(5), 1582-1591 (2008)
10. Lin, C.T., Lin, F.C., Chen, S.A., Lu, S.W., Chen, T.C., Ko, L.W.: EEG-based brain-computer interface for smart living environmental auto-adjustment. Journal of Medical and Biological Engineering 30(4), 237-245 (2010)
11. MacKenzie, I.S., Soukoreff, R.W.: Phrase sets for evaluating text entry techniques. In: CHI'03 Extended Abstracts on Human Factors in Computing Systems. pp. 754-755. ACM (2003)
12. Majaranta, P.: Communication and text entry by gaze. Gaze interaction and applications of eye tracking: Advances in assistive technologies pp. 63-77 (2012)
13. Marshall, D., Coyle, D., Wilson, S., Callaghan, M.: Games, gameplay, and BCI: the state of the art. IEEE Transactions on Computational Intelligence and AI in Games 5(2), 82-99 (2013)
14. McFarland, D.J., Krusienski, D.J., Sarnacki, W.A., Wolpaw, J.R.: Emulation of computer mouse control with a noninvasive brain-computer interface. Journal of Neural Engineering 5(2), 101 (2008)
15. NeuroSky: NeuroSky MindWave Mobile EEG, http://store.neurosky.com/pages/mindwave (accessed: Dec 2016)
16. Obermaier, B., Muller, G.R., Pfurtscheller, G.: "Virtual keyboard" controlled by spontaneous EEG activity. IEEE Transactions on Neural Systems and Rehabilitation Engineering 11(4), 422-426 (2003)
17. Păsărică, A., Bozomitu, R.G., Cehan, V., Rotariu, C.: Eye blinking detection to perform selection for an eye tracking system used in assistive technology. In: Design and Technology in Electronic Packaging (SIITME), 2016 IEEE 22nd International Symposium for. pp. 213-216. IEEE (2016)
18. Perego, P., Turconi, A., Andreoni, G., Gagliardi, C.: Cognitive ability assessment by brain-computer interface II: application of a BCI-based assessment method for cognitive abilities. Brain-Computer Interfaces 1(3-4), 170-180 (2014)
19. Scherer, R., Muller, G., Neuper, C., Graimann, B., Pfurtscheller, G.: An asynchronously controlled EEG-based virtual keyboard: improvement of the spelling rate. IEEE Transactions on Biomedical Engineering 51(6), 979-984 (2004)
20. Scott MacKenzie, I., Ashtiani, B.: BlinkWrite: efficient text entry using eye blinks. Universal Access in the Information Society 10(1), 69-80 (2011)
21. Scott MacKenzie, I., Ashtiani, B.: BlinkWrite: efficient text entry using eye blinks. Universal Access in the Information Society 10(1), 69-80 (2011)
22. Soukoreff, R.W., MacKenzie, I.S.: Measuring errors in text entry tasks: an application of the Levenshtein string distance statistic. In: CHI'01 Extended Abstracts on Human Factors in Computing Systems. pp. 319-320. ACM (2001)
23. Stamps, K., Hamam, Y.: Towards inexpensive BCI control for wheelchair navigation in the enabled environment: a hardware survey. Brain Informatics 6334, 336-345 (2010)
24. Tan, D., Nijholt, A.: Brain-computer interaction: applying our minds to human-computer interaction (2010)