Reading The Brain From The Outside - Headache
PATRICK KAIFOSH’S left hand lies flat on the table in front of him.
Occasionally his fingers twitch or his palm rises up slightly from the
surface. There is nothing obvious to connect these movements with what is
happening on the tablet in front of him, where a game of Asteroids is being
played. Yet he is controlling the spaceship on the screen as it spins, thrusts
and fires. The trick lies in the electrode-studded sweatband on his wrist,
which picks up the electrical signals that his nervous system sends to the
muscles of his hand.
Some researchers dismiss as nonsense the claim by Dr Kaifosh and Thomas
Reardon, his co-founder, that their firm, CTRL-Labs, has created a
brain-machine interface. The sweatband is nowhere near the brain, and the
signals it picks up are generated not just by the firing of motor neurons
but by the electrical activity of muscles. “If this is a BCI, then the movement of my
fingers when I type on a keyboard is also a brain output,” sniffs one
researcher. Krishna Shenoy, who directs the neural prosthetics systems lab
at Stanford University and advises the firm, disagrees: he thinks CTRL-Labs
falls on the right side of the divide. “Measuring the movement of the hand is motion
capture. They are picking up neural activity amplified by the muscles.”
Whatever the semantics, it is instructive to hear the logic behind the firm’s
decision to record the activity of the peripheral nervous system, rather
than looking directly inside the head. The startup wants to create a
consumer product (its potential uses include being an interface for
interactions in virtual reality and augmented reality). It is not reasonable
to expect consumers to undergo brain surgery, say the founders, and
current non-invasive options for reading the brain provide noisy, hard-to-
read signals. “For machine-learning folk, there is no question which data
set—cortical neurons or motor neurons—you would prefer,” says Dr
Reardon.
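CTRL-Labs has published nothing about how its decoder actually works, so the following is only a toy sketch of the general idea behind surface-EMG decoding: extract a crude amplitude feature per electrode and map it to the nearest known gesture. The channel count, gesture names and activation patterns below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

CHANNELS = 8  # electrodes spaced around a wristband (an assumption, not CTRL-Labs' spec)

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel: a standard, crude EMG feature."""
    return np.sqrt((window ** 2).mean(axis=0))

def simulate_gesture(pattern: np.ndarray, samples: int = 200) -> np.ndarray:
    """Each gesture excites the channels in a characteristic amplitude pattern."""
    return rng.normal(0.0, 1.0, (samples, CHANNELS)) * pattern

# Two made-up gestures: "thrust" loads one side of the band, "fire" the other.
patterns = {
    "thrust": np.array([3, 3, 3, 1, 1, 1, 1, 1.0]),
    "fire":   np.array([1, 1, 1, 1, 3, 3, 3, 3.0]),
}

# Train: average the RMS feature vector (a centroid) per gesture over repetitions.
centroids = {
    name: np.mean([rms_features(simulate_gesture(p)) for _ in range(10)], axis=0)
    for name, p in patterns.items()
}

def classify(window: np.ndarray) -> str:
    """Assign a window of EMG samples to the nearest gesture centroid."""
    f = rms_features(window)
    return min(centroids, key=lambda name: np.linalg.norm(f - centroids[name]))

print(classify(simulate_gesture(patterns["thrust"])))  # thrust
print(classify(simulate_gesture(patterns["fire"])))    # fire
```

A real system would use richer features and a learned model, but the core pipeline, electrode signals in, discrete commands out, is the same shape.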
This trade-off between the degree of invasiveness and the fidelity of brain
signals is a big problem in the search for improved BCIs. But plenty of
people are trying to find a better way to read neural code from outside the
skull.
The most familiar non-invasive method, electroencephalography (EEG), records
electrical activity through electrodes on the scalp, and its signals are
notoriously noisy. Even so, some EEG signals are strong enough to be picked
up pretty reliably. An “event-related potential”, for example, is an
electrical signal that the brain consistently gives off in response to an
external stimulus of some sort. One such, called an error-related potential
(Errp), occurs when a user
spots a mistake. Researchers at MIT have connected a human observer
wearing an EEG cap to an industrial robot called Baxter as it carried out a
sorting task. If Baxter made a mistake, an Errp signal in the observer’s
brain alerted the robot to its error; helpfully, if Baxter still did not react,
the human brain generated an even stronger Errp signal.
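The article gives the MIT-Baxter setup only in outline. As a hedged sketch of the underlying signal-processing idea, the toy below simulates noisy EEG epochs, averages them time-locked to the robot's actions, and flags an Errp-like negative dip; the sample rate, amplitudes and threshold are all invented, not taken from the MIT work.

```python
import numpy as np

rng = np.random.default_rng(0)

FS = 250               # sample rate in Hz, typical for research EEG rigs
EPOCH = int(0.6 * FS)  # 600 ms window time-locked to each robot action

def make_epoch(error: bool) -> np.ndarray:
    """Simulate one EEG epoch (microvolts); an error adds a dip around 250 ms."""
    x = rng.normal(0.0, 5.0, EPOCH)  # background EEG noise
    if error:
        t = np.arange(EPOCH) / FS
        x += -8.0 * np.exp(-((t - 0.25) ** 2) / (2 * 0.05 ** 2))
    return x

def detect_errp(epochs: np.ndarray, threshold_uv: float = -4.0) -> bool:
    """Average time-locked epochs to suppress noise, then look for the dip."""
    avg = epochs.mean(axis=0)
    window = slice(int(0.2 * FS), int(0.3 * FS))  # 200-300 ms after the event
    return avg[window].min() < threshold_uv

error_trials = np.stack([make_epoch(True) for _ in range(40)])
clean_trials = np.stack([make_epoch(False) for _ in range(40)])
print(detect_errp(error_trials))  # True: the averaged dip crosses the threshold
print(detect_errp(clean_trials))  # False: pure noise averages toward zero
```

Averaging is the key trick: the noise shrinks with the square root of the number of trials, while the time-locked potential does not, which is why event-related potentials can be read reliably from an otherwise messy EEG trace.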
Facebook is not saying much about what it is doing. Its efforts are being led
by Mark Chevillet, who joined the social-media giant’s Building 8
consumer-hardware team from Johns Hopkins University. The team's approach is
optical: shining light into the head and inferring neural activity from the
photons that make it back out. To cope with the problem of light scattering
as it passes through brain tissue, the team hopes to be able to pick up both
ballistic photons, which pass through tissue in a straight line, and what it
terms “quasi-ballistic photons”, which deviate slightly but can still be
traced to a specific source. The clock is
ticking. Dr Chevillet has about a year of a two-year programme left to
demonstrate that the firm’s goal of brain-controlled typing at 100 words a
minute is achievable using current invasive cell-recording techniques, and
to produce a road map for replicating that level of performance non-
invasively.
In the meantime, other efforts to decipher the language of the brain are
under way. Some involve heading downstream into the peripheral nervous
system. One example of that approach is CTRL-Labs; another is provided
by Qi Wang, at Columbia University, who researches the role of the locus
coeruleus, a nucleus deep in the brain stem that plays a role in modulating
anxiety and stress. Dr Wang is looking at whether stimulating the vagus
nerve through the skin can affect the locus coeruleus; the nerve runs all
the way from the brain into the abdomen.
Another approach is to put electrodes under the scalp but not under the
skull. Maxime Baud, a neurologist attached to the Wyss Centre, wants to
do just that in order to monitor the long-term seizure patterns of
epileptics. He hopes that once these patterns are revealed, they can be used
to provide accurate forecasts of when a seizure is likely to occur.
Yet others think they need to go directly to the source of action potentials.
And that means heading inside the brain itself.