Brain-Computer Interface Learning System For Quadriplegics
Abstract—The proposed Brain-Computer Interface (BCI) system enables quadriplegic patients, people with severe motor disabilities, to send commands to electronic devices and communicate with ease. Interactive sessions are vital for effective knowledge transfer in any learning ecosystem. The growth of BCI has led to rapid development of 'assistive systems' for the disabled, known as 'assistive domotics'. A Brain-Computer Interface reads the brainwaves of an individual and analyses them to obtain meaningful data. This processed data can be used to help people with speech disorders, and sometimes people with limited locomotion, to communicate. In this project, the Emotiv EPOC headset is used to obtain the electroencephalogram (EEG). The obtained data is processed to communicate pre-defined commands and queries for interactive learning. EEG data can also be used to monitor students' emotional behaviour and provide emotional feedback to them. Other vital information, such as heartbeat, blood pressure, ECG and temperature, is monitored and uploaded to the server. The data is processed on an Intel Edison system-on-chip (SoC), and patient metrics are displayed via the Intel IoT Analytics cloud service.

… generated by the brain. This neural activity can be recorded using invasive and non-invasive methods. Electroencephalography was pioneered by Hans Berger in 1929, and considerable progress has been made over the last 30 years in improving the recording of brainwaves.

The general functional model of a BCI system, shown in Figure 1, is discussed below.
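The general functional model referred to above can be summarised as three stages: signal acquisition, feature extraction, and translation of the feature into a device command. A minimal sketch of that loop is given below; the epoch length, power threshold and command labels are illustrative assumptions, not values from this paper.

```python
# Minimal sketch of the general BCI functional model:
# signal acquisition -> feature extraction -> translation into a command.
# Threshold and labels are illustrative assumptions.

def extract_feature(samples):
    """Feature extraction: mean signal power of one EEG epoch."""
    return sum(s * s for s in samples) / len(samples)

def translate(power, threshold=100.0):
    """Translation stage: map the extracted feature to a device command."""
    return "TRIGGER_COMMAND" if power >= threshold else "IDLE"

def bci_pipeline(epoch):
    """One pass of the functional model over a single acquired epoch."""
    return translate(extract_feature(epoch))

# Usage: a quiet epoch stays idle, a high-amplitude epoch fires a command.
print(bci_pipeline([1.0] * 128))   # low power -> IDLE
print(bci_pipeline([15.0] * 128))  # high power -> TRIGGER_COMMAND
```

In a real system the feature extractor would operate on band-filtered multi-channel data and the translator would be a trained classifier; the threshold rule stands in for both here.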
III. METHODOLOGY
The overall working of the Emotiv EPOC headset was analyzed and tested using the company firmware, the Emotiv SDK Community Edition, as shown in Figure 4. Facial expressions and mental commands were tested and trained using the Emotiv Control Panel and EmoBot applications, as shown in Figure 5.

EmoKey links any detected EEG wave pattern to a combination of keystrokes. The desired expression is mapped to a specific keystroke with key-hold-time and trigger options, and the sensitivity is set to a particular trigger level. For example, the imagery of a glass of water is mapped to the letter 'w', which is decoded as the command "I need a glass of water". Figure 6 shows the EmoKey setup.

Figure 6: EmoKey creation

The EmoKey software acts as a bridge connecting the Emotiv Control Panel to the Intel Arduino IDE. Each action responsible for a particular command is mapped to one character of the standard keyboard. After optimal connection of the electrodes, EEG signals are read by the headset. After feature extraction, the obtained variable is compared against the stored EmoKey variable set. On a hit, the corresponding action is performed and the system resumes reading the EEG signal.

These EmoKeys, along with the Intel Edison, allow a disabled person to communicate their basic needs using pre-defined commands. The commands can also be used to control an application or hardware to serve their personal needs.

IV. EXPERIMENTAL RESULTS

The IoT Kit agent runs a daemon on the device, listening for simple messages from other processes and handling the message formatting and security needed to send observations to the cloud. A PuTTY terminal is used to configure the Intel Edison and establish the connection with the IoT Analytics server. The iotkit-admin utility provides functions such as testing the network, activating a device, registering time series and components, providing proxy information for the network, and sending test observations.

In the cloud dashboard, components are added for data access and analysis. A component is either a time series, which consists of sensor observations sent from the device, or an actuator, which allows commands to be sent to a device. Figure 7 shows the catalogue of registered devices.

Figure 7: Dashboard – catalogue containing the registered components
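The EmoKey matching loop described in the methodology — trained action to keystroke, keystroke to pre-defined phrase, gated by a sensitivity trigger — can be sketched as follows. The action names, key bindings, phrases and trigger level are illustrative assumptions; the real bindings are configured interactively in EmoKey.

```python
# Sketch of the EmoKey matching loop: each trained mental command is bound
# to one keyboard character, and a hit on that character emits a phrase.
# All names, keys and thresholds here are illustrative assumptions.

EMOKEY_MAP = {
    "imagine_glass_of_water": "w",   # action -> keystroke bound in EmoKey
}
PHRASES = {
    "w": "I need a glass of water",  # keystroke -> decoded command phrase
}

def decode(detected_action, sensitivity, trigger_level=0.6):
    """Return the phrase for a detected action if it clears the trigger level."""
    if sensitivity < trigger_level:
        return None                        # below threshold: no keystroke fires
    key = EMOKEY_MAP.get(detected_action)  # look up the bound keystroke
    return PHRASES.get(key)                # decode keystroke into a phrase

print(decode("imagine_glass_of_water", 0.8))  # -> I need a glass of water
print(decode("imagine_glass_of_water", 0.3))  # below trigger level -> None
```

On a miss (unknown action or sub-threshold sensitivity) the system simply returns to reading the EEG stream, as described above.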
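The device-side setup steps performed over the PuTTY session — testing the network, activating the device, registering a component, and sending a test observation — take roughly the following form with iotkit-admin. The command forms follow Intel's IoT Analytics guide, but the activation code and component names below are placeholders and exact options may vary with the agent version.

```shell
# Typical device-side setup with Intel's iotkit-admin utility
# (ACTIVATION_CODE and component names are placeholders).
iotkit-admin test                            # check connectivity to the cloud
iotkit-admin activate ACTIVATION_CODE        # pair the Edison with the account
iotkit-admin register temp temperature.v1.0  # register a time-series component
iotkit-admin observation temp 36.8           # send a test observation
```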
After proper logging of the device and communication channel, data obtained from the sensors is sent to the cloud using the 'iotkit' library from Intel Arduino.

The Intel Edison is connected to Twilio by creating a Twilio account and installing the Curl and Crypto libraries needed for its functioning. The interfacing of the Edison with Twilio is programmed in C++.

Proper data logging of the sensor data was achieved, and the data can be visualized as graphs in the cloud. The user-defined commands controlled by the EEG waves are displayed in the Arduino IDE Serial Monitor. The system is capable of synthesizing voice using a text-to-speech (TTS) converter: using EmoKeys, voice synthesis can be emulated for disabled people through an interface enabling sentence formation and machine learning with inputs from the EEG data. Figures 8 and 9 show the observations logged from the sensors; these collected data are used for further analytics and plotting.

VI. FUTURE RESEARCH DIRECTIONS

Emotional stability is the degree to which a person is well adjusted, calm and secure, and emotions profoundly influence learning. Positive emotions are experienced as pleasant and vary in their physiological and cognitive activation; they influence learning by affecting students' attention, motivation, use of learning strategies and self-regulation of learning. Negative emotions draw students' attention away from learning and toward concern about performance and its consequences, thereby reducing task-focused attention. They are a major factor explaining why many students do not live up to their potential and fail to pursue the educational career that would correspond to their abilities and interests.

EEG data can be used to monitor students' emotional behaviour and provide emotional feedback to them. This feedback enables students to improve their emotional stability and learning process. Brainwave analysis can also be extended to ascertain one's learning style at an early age: studying an individual's response to various visual, auditory and kinaesthetic stimuli can be used to determine the preferred learning style for effective assimilation of knowledge.
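One way such emotional feedback could be derived from EEG, sketched below, is a band-power ratio. The beta/(alpha+theta) engagement index is a commonly used heuristic in the EEG literature, but it is not taken from this paper, and the band powers and threshold shown are illustrative assumptions.

```python
# Hedged sketch: turning EEG band powers into a student feedback signal
# using the common beta/(alpha+theta) engagement-index heuristic.
# Band-power values and the threshold are illustrative assumptions.

def engagement_index(theta, alpha, beta):
    """Engagement index from band powers (arbitrary units)."""
    return beta / (alpha + theta)

def feedback(theta, alpha, beta, attentive_above=0.5):
    """Map the index to a simple feedback label for the student."""
    idx = engagement_index(theta, alpha, beta)
    return "attentive" if idx >= attentive_above else "distracted"

print(feedback(theta=4.0, alpha=6.0, beta=7.0))  # 7/10 = 0.70 -> attentive
print(feedback(theta=8.0, alpha=8.0, beta=4.0))  # 4/16 = 0.25 -> distracted
```

A deployed system would compute the band powers from spectrally filtered EEG epochs and calibrate the threshold per student; the constants here only illustrate the mapping from index to feedback.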
VII. ACKNOWLEDGEMENT