A Seminar Report On Brain Computer Interface

The document provides an overview of brain-computer interfaces (BCIs). It defines a BCI as a direct technological interface between a brain and a computer that does not require motor output. The document discusses BCI technology, including EEG and neural implants, and provides three examples: 1) monkeys learned to control a robotic arm with their brain signals, and their brains adapted to treat the robotic arm as their own; 2) the company Cyberkinetics is leading private-sector BCI research; 3) a non-invasive BCI was used to control a mobile robot.

Uploaded by Anit Sachdeva
© Attribution Non-Commercial (BY-NC)

A
Seminar Report
On
Brain Computer Interface

Submitted To:
Ms. Kulpreet Oberoi
Ms. Sonia Arora
Ms. Mamta Bhola
Mr. Neeraj Gupta

Submitted By:
Sandeep Kumar
1031/07
MCA 4th Sem.
Contents

 Brain Computer Interface

 What is a Brain Computer Interface?

 Technology of Brain Computer interface

 Example of Brain Computer Interface

 BCI Example 1: Monkeys Adapt Robot Arm as Their Own

 BCI Example 2: Cyberkinetics

 BCI Example 3: Non-Invasive Brain-Actuated Control of a Mobile Robot

 Neural/BIO Feedback

 Ethical and Social Implications of Brain-Computer Interfaces

 Conclusion
Brain-Computer Interfaces

In this chapter an overview is given of the current state of the art in brain-
computer interfaces. This is done by laying out the technology used for brain-
computer interfaces and by reviewing several case studies.

What is a Brain-Computer Interface?

A lot of research has been done on brain-computer interfaces and many definitions have
been given. Wolpaw et al. (2002) define a brain-computer interface as “a device that
provides the brain with a new, non-muscular, communication and control channel”.
Levine (2000) says “A direct brain-computer interface accepts voluntary commands
directly from the human brain without requiring physical movement and can be used to
operate a computer or other technologies.” Kleber and Birbaumer (2005) say that “A
brain-computer interface provides users with the possibility of sending messages and
commands to the external world without using their muscles”.

The following is the definition for brain-computer interfaces that will be used in this
thesis.

A brain-computer interface is a direct technological interface between a brain and a computer that does not require any motor output from the user. Neural impulses in the brain are intercepted and used to control an electronic device.
To explain and justify the reasons for researching brain-computer interfaces as
an input method for wearable computers, a description of a ‘perfect’ brain-
computer interface is given as follows.
A perfect brain-computer interface is such that any interaction a user wants with a computer is understood by that computer, at the moment the user wants that interaction to take place. This should cover any desirable interaction a user could want with a computer, such as menu navigation, text input, and pointer control. This interaction should cause no straining or monopolization of the mind and should be performed as though it were as natural as moving an arm or a leg.

It is understood that the described ‘perfect’ brain-computer interface is beyond current technology, but it is meant as a model against which brain-computer interfaces can be evaluated.

Technology of Brain Computer Interfaces

Since Hans Berger’s discovery of the electroencephalogram, or EEG i, in 1924 (his papers were later translated into English by Pierre Gloor in 1969), the brain-computer interface has been a possibility. Being able to register brain activity electronically enables one to use this electronic output to control an electronic device.

The study of brain activity has shown that when a person performs a certain task
such as stretching the right arm, a ‘signal’ is created in the brain and is sent
through the nervous system to the muscles. Research has shown that when the
same person moves his arm in the same way ten times, there is clearly a pattern
to the neural activity. Scientists (Carmen et al., 2004) have therefore concluded
that if one is able to read the brain activity and scan for certain specific patterns,
this information can then be used for interaction. The more specific the
registering of the neural activity, the more precise and detailed the possible
interaction.
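The pattern-matching idea described above can be sketched in a few lines of code. This is a deliberately simplified illustration, not the method used in the cited research: each trained action is represented by a stored template of averaged activity, and a new recording is assigned to the action whose template it correlates with best.

```python
def correlation(a, b):
    """Pearson correlation between two equal-length signals."""
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / ((var_a * var_b) ** 0.5)

# Hypothetical stored activity patterns, one per trained action,
# e.g. averaged over the ten repetitions mentioned in the text.
templates = {
    "stretch_right_arm": [0.1, 0.9, 0.4, -0.2, -0.8, -0.1],
    "stretch_left_arm":  [-0.1, -0.7, -0.3, 0.2, 0.9, 0.2],
}

def classify(recording):
    """Return the action whose template best matches the recording."""
    return max(templates, key=lambda t: correlation(templates[t], recording))

# A noisy repetition of the right-arm pattern still matches its template.
trial = [0.2, 0.8, 0.3, -0.1, -0.9, 0.0]
print(classify(trial))  # -> stretch_right_arm
```

The more finely the neural activity is registered, the longer and more distinctive these templates can be, which is exactly the precision argument made above.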

There are several methods of monitoring brain activity. These methods can be divided into three distinct groups, namely external, surgical, and nanotechnological.

Firstly, there are several external or non-invasive methods of monitoring brain activity, including Positron Emission Tomography (PET) ii, functional Magnetic Resonance Imaging (fMRI) iii, Magnetoencephalography (MEG) iv, and Electroencephalography (EEG) techniques, which all have advantages and disadvantages.

In practice only EEG yields data that is easily recorded with relatively
inexpensive equipment, is rather well studied, and provides high temporal
resolution (Krepki et al., 2003). EEG is therefore the only external method to be
researched in this thesis.

In order to use EEG as an interface method, electrodes need to be placed over the scalp of a user. These electrodes pick up the neural activity in the regions of the electrodes. The readings made by an EEG, however, are considered very ‘fuzzy’ (Millan et al. 2003). Because the electrodes are relatively far from the brain, it is not possible to pick up the activity of individual brain cells. Lack of accuracy is therefore the main shortcoming of the EEG method, but this is counterbalanced by the non-invasive nature of EEG. The electrodes that are commonly used are placed on the scalp with a paste to ensure conductivity. This makes EEG a very obtrusive and restrictive method for use with a wearable computer. The electrodes are both socially unacceptable and physically uncomfortable.
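To give an impression of how such ‘fuzzy’ EEG data is made usable, a common first step is to extract the power in a frequency band of interest, for example the 8–12 Hz mu rhythm over the motor cortex. The sketch below applies a plain discrete Fourier transform to a synthetic signal; the sampling rate, band edges, and signal are illustrative assumptions, not values from any cited study.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Summed DFT power of `signal` between f_lo and f_hi (Hz)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(-signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

fs = 128  # assumed sampling rate in Hz
t = [i / fs for i in range(fs)]  # one second of data
# Synthetic "EEG": a strong 10 Hz mu-rhythm component plus a weak 25 Hz one.
eeg = [math.sin(2 * math.pi * 10 * x) + 0.3 * math.sin(2 * math.pi * 25 * x) for x in t]

mu = band_power(eeg, fs, 8, 12)     # dominated by the 10 Hz component
beta = band_power(eeg, fs, 18, 30)  # much smaller
print(mu > beta)  # -> True
```

A real system would compare such band-power features against a user-specific baseline, since the raw traces themselves are too noisy to threshold directly.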

Secondly, brain activity can be measured by surgically implanting computer chips with microelectrodes onto the brain, thus enabling the measurement of activity in certain areas of the brain (Nicolelis 2003). This technology allows for the reading of the activity of individual neurons, making it a very precise method. There are certain risks involved with applying these chips to the brain. In practice, this method is used only on laboratory animals and people who are completely paralyzed.

Finally, nanotechnology v offers a method of reading brain activity. Nanotechnology, being the young field of science that it is, is also the least developed method of registering neural activity. However, the technology is so promising that it is worth considering in this thesis. Neuroscientist Llinas (2005) and his colleagues describe a method of creating platinum nanowires 100 times thinner than a human hair and using blood vessels as conduits to guide the wires to monitor individual brain cells.

Llinas (2005) is working on creating a method for feeding an entire array of nanowires through a catheter tube, so that the wires could be guided through the circulatory system to the brain. The nanowires would then spread out into a kind of bouquet, branching out into thinner and thinner blood vessels, to reach specific neurons.

These wires are far thinner than even the smallest blood vessels, so they could be guided to any point in the body without blocking blood flow.

In a proof-of-principle experiment, in which they first guided platinum nanowires into the vascular system of tissue samples, they successfully detected the activity of adjacent neurons.

Examples of Brain-Computer Interfaces

In order to clarify the working of brain-computer interfaces, some case studies are given.

BCI Example 1: Monkeys Adapt Robot Arm as Their Own


At Duke University Medical Center, neurobiologists have been teaching monkeys to control external devices with their brain signals. Unexpectedly, the monkeys’ brain structures are adapting to treat the robotic arm as one of their own appendages.

This finding has important implications for understanding the adaptability of the
primate brain and promises great possibilities for giving paraplegics physical
control over their environment.
Led by neurobiologist Miguel Nicolelis (2002) of Duke’s Center for
Neuroengineering, the experiments consisted of implanting an array of
microelectrodes, thinner than a human hair, into the frontal and parietal lobes of
the brains of two female rhesus macaque monkeys. A specially developed
computer system analyzed the faint signals from the electrode arrays and
recognized patterns that represented particular movement by a monkey’s arm.
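The report does not describe how the computer system recognized these patterns, but decoders of this kind are commonly linear: each recorded neuron is given a weight, and the weighted sum of firing-rate deviations predicts a movement parameter such as hand velocity. The sketch below is a toy illustration with invented numbers, not the actual Duke decoder.

```python
# Hypothetical linear decoder: predicted hand velocity along one axis is a
# weighted sum of each neuron's firing-rate deviation from its resting rate.
weights = [0.5, -0.2, 0.8, 0.1]     # one weight per electrode, from calibration
baseline = [20.0, 15.0, 30.0, 10.0]  # resting rates (spikes/s), from calibration

def decode_velocity(firing_rates, weights, baseline):
    """Linear read-out: sum of weighted deviations from baseline rates."""
    return sum(w * (r - b) for w, r, b in zip(weights, firing_rates, baseline))

# Firing rates while the monkey intends a rightward movement (invented numbers).
rates = [28.0, 12.0, 38.0, 10.0]
v = decode_velocity(rates, weights, baseline)
print(v)  # 0.5*8 + (-0.2)*(-3) + 0.8*8 + 0.1*0 = 11.0
```

In the actual experiments the weights would be fitted during the joystick phase, when intended movement and neural activity can be observed together.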

Initially the monkeys were taught to use a joystick to position a cursor over a
target on a video screen and to grasp the joystick with a specified force (Figure
5-1).

Figure 0-1: Monkey BCI control setup

After this initial training, the researchers made the cursor more than a simple display, incorporating the dynamics, such as inertia and momentum, of a robot arm functioning in another room. Performance worsened initially, but the monkeys soon learned to deal with these new dynamics and became capable of controlling the cursor, which reflected the movements of the robot arm.

Following this, the researchers removed the joystick and the monkeys continued
moving their arms in the air where the joystick used to be, still controlling the
robot arm in the other room. After a few days, the monkeys realized they did not
need to move their own arms in order to move the cursor. They kept their arms at
their sides and controlled the robot arm with only their brain and visual
feedback.

The extensive data drawn from these experiments showed that a large percentage
of the neurons become more ‘entrained’. In other words, their firing becomes
more correlated to the operation of the robot arm than to the monkey’s own arm.

According to Nicolelis (2002), this showed that the brain cells originally used for
the movement of their own arm had now shifted to controlling the robot arm. The
monkeys could still move their own arms, but the control for that had been
shifted to other brain cells.

Further analysis by Lebedev (2002) showed that the monkey could simultaneously be doing one thing with its real arms and another thing with the robot arm. “Our hypothesis is that the adaptation of brain structures allows the expansion of capability to use an artificial appendage with no loss of function, because the animal can flip back and forth between using the two. Depending on the goal, the animal could use its own arm or the robotic arm, and in some cases both. This finding supports our theory that the brain has extraordinary abilities to adapt to incorporate artificial tools, whether directly controlled by the brain or through the appendages,” said Nicolelis (2002).

In fact, the scientists suggest that it is a fundamental trait of higher primates that their brains have the adaptability to incorporate new tools into the very structure of the brain.

The conclusion drawn from these experiments is that a brain-computer interface becomes a ‘natural’ extension of the brain.

BCI Example 2: Cyberkinetics

The company Cyberkinetics is leading research on brain-computer interfaces in the private sector. In 2004 the company took in its first patient, Matthew Nagle, a quadriplegic, paralyzed from the neck down in a stabbing three years earlier. He has been participating in a clinical trial to test Cyberkinetics’ BrainGate system. Sitting in his wheelchair, Nagle can now open e-mail, change TV channels, turn on lights, play video games such as Tetris, and even move a robotic hand, just by thinking.

The device, which is implanted underneath the skull in the motor cortex, consists of a computer chip that is essentially a 2 mm by 2 mm array of 100 electrodes (Figure 5-3). Surgeons attached the electrode array like Velcro to neurons in Nagle’s motor cortex (Figure 5-3), which is located in the brain just above the right ear. The array is attached by a wire to a plug that protrudes from the top of Nagle’s head.

Figure 0-2: electrode array

Richard Martin (2005) visited Nagle for an article in Wired Magazine. After his accident in 2001, Nagle begged to be Cyberkinetics’ first patient. With his young age and his strong will to walk again, he turned out to be an ideal patient. The chip was surgically implanted, and, after a period of recovery, the tests could start. Nagle had to think ‘left’ and ‘right’, the way he was able to move his hand before being paralysed. After only several days he succeeded in controlling the cursor on a computer.

Figure 0-3: array position

When asked what he was thinking, he replied: “For a while I was thinking
about moving the mouse with my hand. Now, I just imagine moving the cursor
from place to place.” His brain has assimilated the system. Nagle is now able to
play Pong (and even win), read e-mail, control a television set and control a
robot hand.

BCI Example 3: Non-Invasive Brain-Actuated Control of a Mobile Robot
Krepki et al. (2003) researched brain control of a mobile robot, using non-
invasive EEG as the interface method.
Two subjects wore a commercial EEG cap and learned to control three mental
tasks during an initial training period. These tasks were chosen from the
following: “relax”, imagination of “left” and “right” hand (or arm) movements,
“cube rotation”, “subtraction”, and “word association”.

The three chosen commands were used to steer a robot forwards, right, and
left. This was trained over a period of days. After the training, the subjects were
able to steer the robot through a maze. Correct recognition of the commands was
above 60%, whereas errors were below 5%.
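Recognition above 60% with errors below 5% implies that in the remaining trials the classifier simply issued no command rather than a wrong one. That behaviour can be sketched with a confidence threshold; the task names follow the experiment above, while the probabilities, the task-to-command mapping, and the threshold value are assumptions for illustration.

```python
# Hypothetical mapping from mental tasks to robot steering commands.
COMMANDS = {"relax": "forward", "left_hand": "left", "right_hand": "right"}

def pick_command(probabilities, threshold=0.7):
    """Map classifier probabilities to a robot command, or None (no decision)."""
    task = max(probabilities, key=probabilities.get)
    if probabilities[task] < threshold:
        return None  # below threshold: better no command than a wrong one
    return COMMANDS[task]

print(pick_command({"relax": 0.2, "left_hand": 0.75, "right_hand": 0.05}))  # -> left
print(pick_command({"relax": 0.4, "left_hand": 0.35, "right_hand": 0.25}))  # -> None
```

Raising the threshold trades fewer issued commands for a lower error rate, which is the trade-off reflected in the figures reported above.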

Neural/BIO Feedback

In much the same way as you can directly control machines with your brain, you can also receive feedback directly into the brain. Nicolelis (2004) is currently working on an artificial sense of touch for robotic arms. Successful experiments have also been carried out with robotic eyes connected to the brain. Veraart et al. (2002) have developed a system whereby a video feed is sent to electrodes activating the optic nerve. This currently gives blind people rudimentary vision.

Ethical and Social Implications of Brain-Computer Interfaces

Controlling a system directly with one’s brain raises certain ethical issues. When
you can control external systems in the same way as you control your own body,
the question can arise of whether or not this system becomes part of you. Also,
the idea of integrating technology with their bodies can scare people off. These
issues are considered in this chapter.

As seen in the BCI examples of the research done by Nicolelis (2002) and the
Braingate System developed by Cyberkinetics (2005), the subjects using the
brain-computer interfaces assimilate the functionality into their brains. In fact,
the brains have reorganised the functions of specific neurons to improve the
functionality of the interface, at the same time moving the control of the original
appendages to other areas of the brain. The brain is adapted as though the body
has an extra limb. “From a philosophical point of view, we are saying that the
sense of self is not limited to our capability for introspection, our sense of our
body limits, and to the experiences we’ve accumulated.” said Nicolelis (2002).
“It really incorporates every external device that we use to deal with the
environment.” One could indeed ask: “Is it right to alter one’s Self?” But, does a
blind man not alter himself by using a cane?

In these cases too, a physical connection is made between the body and
technology, introducing the ‘cyborg’ into reality. The term ‘cyborg’ was first
mentioned by Clynes and Kline in 1960. “A cyborg is a combination of human
and machine in which the interface becomes a ‘natural’ extension that does not
require much conscious attention, such as when a person rides a bicycle.”
(Starner 1999)

Warwick (2003) suggests that, once the technology to directly interface with the brain exists, technology becomes part of the ‘self’. People could then in fact be ‘enhanced’, and this raises the issue of whether cyborg morals would be the same as human morals. Warwick also states a ‘murky’ view that cyborgs are likely to be networked, and one could ask if it is morally acceptable for cyborgs to give up their individuality.

Other questions arise. Should all humans have the right to be ‘upgraded’ to
cyborgs? How will the relationship between humans and cyborgs develop?

These are not current issues, but one should consider them when working with
brain-computer interfaces.
Conclusion

At the present time, computer technology is mature enough for wearable computing. A computer with all the capabilities of a desktop computer can be fitted into a package no larger than a pack of cards. There are heads-up displays that appear not much different from a pair of spectacles, and wireless networks are widespread. Yet most wearable systems rely on speech control as an input method. When judged against the desirable characteristics for wearable computers, defined in chapter 3, the speech interface does not perform at all well. It is socially very obtrusive; one feels uncomfortable speaking out commands in public, and at the same time it intrudes on the surrounding space. The lack of privacy in the interaction is also significant.

The tactile interfaces, too, do not deliver all the desirable characteristics, as the existing interfaces are fairly restrictive: a device either needs to be held in a hand, or the hands are covered in interface technology.

It can therefore be concluded that there is a demand for an interface technology that delivers more of the desirable characteristics for wearable computers than the technologies in use at present.

The ‘perfect’ brain-computer interface would indeed fulfil all the desirable characteristics for wearable computers. However, in the current state of the art, the ‘perfect’ brain-computer interface does not exist, in spite of the promising advances currently being made. The surgically implanted devices show the best results for successful brain-computer interfaces, but the risks and social antipathy are too great for them to be considered for non-medical use. For paraplegics, the life-enhancement value is so great that the risks and fears associated with an operation do not matter.
The nanotechnological method promises the most precise interaction. Even though it is unknown how the general public would perceive such technology, it could be a far ‘cleaner’ and safer method than surgical implants. The development still has a long way to go, however, and cannot currently be considered ‘the way to go’ for interfacing with wearable computers.

This leaves the non-invasive method of EEG interfaces. The current level of
precision of such interfaces is sufficiently high for simple interfaces, such as
four-way menus. The reaction speed is still very slow, but, with smart software
algorithms, researchers are finding more and more efficient and precise
interaction solutions. There also needs to be more development on the electrodes
required for EEG reading, if people want to feel comfortable walking around
with them.

An EEG-based brain-computer interface is quite feasible in the short term. It could not replace the tactile interface for text input, but it could be an excellent interface for menu, mail, or web navigation. The most promising approach to constructing an interface for wearable computers involves using a combination of interface technologies. In this combination, the addition of an EEG-based interface would be an improvement over current interfaces for wearable computers.

Further development could, however, much improve the current state of the art.
A great challenge for the development of brain-computer interfaces is the fact
that the field is exceptionally multi-disciplinary, covering fields such as
neurobiology, nanotechnology, engineering, mathematics, computer science, and
interaction design. And the current technology is a long way from the ‘perfect’
brain-computer interface.
References

 www.ieeexplore.ieee.org

 www.ask.com

 www.wikipedia.com

 www.howstuffworks.com

 www.cyberkineticsinc.com
i
“Electroencephalography is the neurophysiologic measurement of the electrical activity of the brain by recording from electrodes placed on the scalp or, in special cases, on the cortex. The resulting traces are known as an electroencephalogram (EEG) and represent so-called brainwaves. This device is used to assess brain damage, epilepsy, and other problems. In some jurisdictions it is used to assess brain death. EEG can also be used in conjunction with other types of brain imaging.”

ii
Positron emission tomography (PET) is a nuclear medicine medical imaging technique which
produces a three dimensional image or map of functional processes in the body. (www.wikipedia.org)

iii
Functional Magnetic Resonance Imaging (or fMRI) describes the use of MRI to measure
hemodynamic signals related to neural activity in the brain or spinal cord of humans or other animals.
(www.wikipedia.org)

iv
Magnetoencephalography (MEG) is the measurement of the magnetic fields produced by electrical
activity in the brain, usually conducted externally, using extremely sensitive devices such as SQUIDs.
Because the magnetic signals emitted by the brain are on the order of a few femtotesla (1 fT = 10⁻¹⁵ T), shielding from external magnetic signals, including the Earth's magnetic field, is necessary. An
appropriate magnetically shielded room can be constructed from Mu-metal, which is effective at
reducing high-frequency noise, while noise cancellation algorithms reduce low-frequency common
mode signals. (www.wikipedia.org)

v
“Nanotechnology comprises technological developments on the nanometer scale, usually 0.1 to 100 nm. (One nanometer equals 10⁻⁹ m: one thousandth of a micrometer, or one millionth of a millimeter.) The term has sometimes been applied to microscopic technology.” (www.wikipedia.org)
