‘Noise’
in the Brain Encodes Surprisingly Important Signals
Activity in the visual cortex and
other sensory areas is dominated by signals about body movements, down to
little tics and twitches. Scientists are now rethinking how they study and
conceive of perception.
New research
shows that perception can be deeply informed by movement: The visual
system may constantly process information about even the smallest
gestures.
Lenka Šimečková for Quanta Magazine
November 7, 2019
At every moment, neurons whisper,
shout, sputter and sing, filling the brain with a dizzying cacophony of voices.
Yet many of those voices don’t seem to be saying anything meaningful at all.
They register as habitual echoes of noise, not signal; as static, not
discourse.
Since scientists first became
capable of recording from single neurons 60 years ago, they’ve known that brain
activity is highly variable, even when there’s no obvious reason it should be. An
animal’s neural response to a repeated stimulus changes considerably from trial
to trial, fluctuating in a way that seems almost random. Even in the total
absence of a stimulus, “you would just record spontaneous activity, and you
would see that it seemed to have a mind of its own,” according to David
McCormick, a neuroscientist at the University of Oregon.
“This gave rise to a view of the
brain as being somehow either very [noisy], or using some type of high-level
statistics to get over this noisiness,” he said.
But over the past decade, that
view has changed. It’s become apparent that this purported randomness and
variability relates not just to messiness in the brain’s neural mechanics, but
also to behavioral states like arousal and stress — states that seem to affect
perception and decision-making as well. There’s more to all the noise,
scientists realized, than they had assumed.
Now, by analyzing both the neural
activity and the behavior of mice in unprecedented detail, researchers have
revealed a surprising explanation for much of that variability: Throughout
the brain, even in low-level sensory areas like the visual cortex, neurons
encode information about far more than their immediately relevant task. They
also babble about whatever other behaviors the animal happens to be engaging
in, even trivial ones — the twitch of a whisker, the flick of a hind leg.
Those simple gestures aren’t
just present in the neural activity. They dominate it.
The findings are changing how
scientists interpret brain activity, and how they design experiments to study
it.
An Elegant but Outdated Story
Until about a decade ago, most
neuroscientists used anesthetized animals in their experiments. This practice
enabled them to make incredible strides toward a better understanding of the
brain, but it also “led to a highly distorted view of neuronal
processing,” said David
Kleinfeld, a neurophysicist at the University of California, San Diego. It
was a particular drawback in vision research, because the levels of anesthesia
involved were often high enough to undercut confidence that the groggy (or
unconscious) animals were subjectively seeing anything at all. At the very
least, the anesthesia stripped away any real framework or context that might
influence the visual process.
As a result, a certain picture of
how vision works emerged, one in which signals from the eyes moved through
passive sets of neurological filters that created increasingly specialized and
complex representations of the environment. Only later in the process did that
visual representation get integrated with information from other senses and
other brain areas. “It’s tempting to see primary sensory areas as cameras that
give an unadulterated view of what’s happening in the world,” said Anne Churchland, a neuroscientist at Cold Spring Harbor
Laboratory in New York — but however elegant this model of vision might
be, a mountain of evidence has proved it to be far too simplistic.
The brain’s immense
interconnectivity fills it with feedback loops that let higher cortical areas
talk to lower ones. Over decades of study, researchers have gradually found
each region of the brain to be less specialized than labels might suggest: The
visual cortex of people who are blind or visually impaired, for instance, can
process auditory and tactile information. The somatosensory cortex, and not the
motor cortex, was recently found to play a significant role in the learning of motor-based skills.
And broader forces like attention, expectation or motivation can affect how
people perceive.
Even in the dark, the neurons of
the visual cortex continue to chatter.
But things got even stranger in 2010. New experimental methods,
developed just a few years earlier, made it possible to record from the neurons
of mice running on a treadmill or ball. The neuroscientists Michael
Stryker and Cris Niell, then both at the University of California, San
Francisco, initially decided to use those techniques to compare the visual
activity in sleeping, anesthetized mice with that of awake mice that were
either running or stationary. (Niell has since moved to the University of
Oregon.)
Cris Niell, a
neuroscientist at the University of Oregon, uses a home-built imaging system
and spherical treadmill to measure the neural activity of mice as they run.
Courtesy of University of Oregon
They quickly realized, though,
that there was a more interesting comparison to make. “We went into it wanting
to study what’s different in awake versus anesthetized,” Niell said, “but what
was surprising to us was how different various aspects of the awake state were
when the animal was moving versus not.” They found that when a mouse ran,
its responses to visual stimuli got larger: The neurons’ firing rates doubled —
which was astonishing because in prior research, the firing rates hadn’t gone
up that much even in humans and monkeys required to complete tasks that
demanded intense visual attention. These effects tapered off as soon as the
mouse stopped running.
“That was pretty surprising,
because people had thought of primary visual cortex as being a purely sensory
area,” Churchland said.
At the time, in keeping with
conclusions drawn from previous work on attention and motivation, Stryker and
Niell thought that the result might reflect some general behavioral shift in
the mice, a transition from an inactive mode to an active one as the animals
started to move. Studies in other labs over the next few years confirmed that
general arousal alters neural responses considerably. In 2015, for instance,
McCormick and his colleagues reported that a mouse’s engagement with a
task predicted how well it would perform. That same year, the
neuroscientist Jessica Cardin and her team at Yale University began to disentangle the effects of both running and
arousal on activity in the visual cortex.
But all these experiments were
limited both in how many neurons they could record from at once and in how many
behavioral variables they could account for. Experiments that investigated
changes in brain activity based exclusively on whether the mouse was running or
still, or whether its pupils were dilated as a proxy for arousal, could
explain only bits and pieces of neural variability. A more complete explanation
was still wanting.
Now, by taking a more global
approach to both animal behavior and brain activity, a handful of research
groups have provided exactly that.
Finding Sense in the Swirling
Kaleidoscope
Kenneth Harris and Matteo Carandini,
neuroscientists at University College London, started with a different goal: to
characterize the structure of the spontaneous activity in the visual cortex
that occurs even when the rodent gets no visual stimulation. They and other
members of their joint team at the university’s Cortexlab recorded
from 10,000 neurons at once in mice that were free to act as they wanted — to
run, sniff, groom themselves, glance around, move their whiskers, flatten their
ears and so on — in the dark.
Marius
Pachitariu, a neuroscientist at the Howard Hughes Medical Institute’s Janelia
Research Campus in Virginia.
Matt Staley/Janelia Research
Campus
The researchers found that even
though the animals couldn’t see anything, the activity in their visual cortex
was both extensive and shockingly multidimensional, meaning that it was
encoding a great deal of information. Not only were the neurons
chatting, but “there were many conversations going on at the same time,”
wrote Marius Pachitariu, a neuroscientist at the Howard Hughes
Medical Institute’s Janelia Research Campus in Virginia, and a former
postdoctoral researcher in the Cortexlab.
At first, the scientists weren’t
sure what to make of it. So they tried to explain the “conversations” by
relating the brain activity to exactly what the mice were doing at each moment.
They took videos of the face of each mouse, analyzing its motion frame by frame
not just for single facets of behavior like running speed or pupil diameter,
but for anything at all that might explain the neural variability, down to the
tiniest tics and twitches.
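For readers curious what that kind of analysis looks like in practice, here is a minimal sketch in Python of the general idea: compress the face video into a handful of motion features, then ask how much of the neural activity a simple linear model of those features can explain on held-out data. The array names, the frame-difference "motion energy," and the PCA-plus-ridge-regression model are illustrative assumptions, not the Cortexlab's published pipeline.
```python
# Illustrative sketch (not the published analysis): explain neural activity
# from face-video motion features using PCA plus ridge regression.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical data shapes: T video frames, P pixels, N neurons.
T, P, N = 2000, 500, 100
video = rng.random((T, P))     # stand-in for flattened face-camera frames
neural = rng.random((T, N))    # stand-in for simultaneously recorded activity

# "Motion energy": absolute frame-to-frame difference of the video.
motion = np.abs(np.diff(video, axis=0))
neural = neural[1:]            # align neural frames with the motion frames

# Compress the motion movie into a few principal components (behavioral features).
features = PCA(n_components=20).fit_transform(motion)

# Fit a linear model from behavioral features to every neuron; the held-out
# score estimates how much neural variance the behavior accounts for.
X_tr, X_te, y_tr, y_te = train_test_split(features, neural, test_size=0.25,
                                          random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)
print("held-out variance explained:", model.score(X_te, y_te))
```
With the random stand-in arrays above the score hovers near zero; the point of the sketch is only the shape of the question being asked of real recordings.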
Those minor behaviors turned out
to account for at least one-third of the ongoing activity in
the mice’s visual cortex — activity that had previously all been chalked up to mere
noise. It was roughly comparable to the activity that an actual visual input
would typically cause. “We have this part of the brain called the visual cortex, and
you would think that what it does is vision,” Harris said. “And it does do
that. But at least as much of its activity has nothing to do with vision.”
“If we look at the mouse as a
whole,” McCormick said, “all of a sudden, that general activity, that swirling
kaleidoscope of activity in the brain, starts to make sense.” (He and his lab
reported similar findings in a
recent preprint.) The activity didn’t just reflect the general state of
the mouse’s alertness or arousal, or the fact that the animal was moving. The
visual cortex knew exactly what the animal was doing, down to the details of
its individual movements.
When
researchers studied the spontaneous activity of more than 10,000 neurons in the
visual cortex of mice, they were surprised to find it to be rich with
information about the animals’ seemingly irrelevant movements. In these images
from the experiment, neurons flash when they send signals. Each panel monitors
a different depth of tissue in the cortex.
Courtesy of Marius Pachitariu and
Carsen Stringer
In fact, this wasn’t unique to
the visual cortex. “Everywhere in the brain, it’s the same story. The movement
signals are just really unmistakable,” said Matt Smear,
a systems neuroscientist at the University of Oregon who did not participate in
the study. The finding cements the idea that “certain intuitive notions about the
brain are probably wrong.”
Even more striking, the same
neurons that encoded sensory or other functional information were the ones
explicitly encoding these motor signals. “All of a sudden we’re saying, ‘Wait —
maybe the brain isn’t noisy. Maybe it’s actually much more precise than we
thought,’” McCormick said.
The Cortexlab’s findings, which
were published in Science in April, demonstrated that
neuroscientists need to rethink how they interpret animals’ neural responses.
(Niell pointed out that a significant amount of variation observed in human
functional MRI studies can also be explained by
random fidgets, rather than noise or anything related to the task under
investigation.) “For instance, every time the mouse was running, we saw this
signal in the neurons right before the mouse would start,” said Carsen Stringer,
a postdoctoral researcher at Janelia who did her doctoral work in the
Cortexlab. “And we thought, ‘Maybe that’s just the mouse thinking to run.’ But
really, it’s the mouse whisking [rhythmically moving its whiskers back and
forth] right before it’s running.”
But then what are these signals
doing, and why do they matter?
Perception as Action
A system in which each neuron
channels information about multiple activities at once might seem unworkably
convoluted, but the Cortexlab team found that the brain can cope with all
that data more easily than we might think. Their analysis revealed that when a
stimulus is shown, the incoming information simply gets added on top of the
movement-related signals that were already present. In a single neuron, those
signals appear jumbled together, impossible to tell apart. But because different neurons pair the same stimulus with different mixtures of background behaviors, recording from enough of them at once makes it possible to tease vision and movement apart.
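A small, self-contained simulation can make that population-level argument concrete. In the Python sketch below, every simulated neuron sums a "stimulus" signal and a "movement" signal with its own random weights; from any single neuron the two are confounded, but a linear readout across the whole population recovers both. All names and numbers are invented for illustration and are not taken from the studies described here.
```python
# Toy illustration of additive mixing: stimulus and movement signals are summed
# in every neuron, yet they are separable across a large enough population.
import numpy as np

rng = np.random.default_rng(1)
T, N = 5000, 200                        # time points, neurons

stimulus = rng.standard_normal(T)       # hidden "visual" signal
movement = rng.standard_normal(T)       # hidden "movement" signal

# Each neuron mixes the two hidden signals with private weights, plus noise.
w_stim = rng.standard_normal(N)
w_move = rng.standard_normal(N)
activity = (np.outer(stimulus, w_stim) + np.outer(movement, w_move)
            + 0.5 * rng.standard_normal((T, N)))

# One neuron alone: its trace correlates only partially with the stimulus,
# because the movement signal is superimposed on it.
single_corr = np.corrcoef(activity[:, 0], stimulus)[0, 1]

# Whole population: a least-squares readout over all neurons recovers each
# hidden signal almost perfectly.
coef, *_ = np.linalg.lstsq(activity, np.column_stack([stimulus, movement]),
                           rcond=None)
decoded = activity @ coef
pop_corr_stim = np.corrcoef(decoded[:, 0], stimulus)[0, 1]
pop_corr_move = np.corrcoef(decoded[:, 1], movement)[0, 1]

print(f"single neuron vs. stimulus:  r = {single_corr:.2f}")
print(f"population decode, stimulus: r = {pop_corr_stim:.2f}")
print(f"population decode, movement: r = {pop_corr_move:.2f}")
```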
The movement signals
therefore aren’t hurting the animal’s ability to process sensory
information about the outside world. But scientists still need to explore
exactly how those signals might help the brain work better. At its core, this discovery reflects the fact that the brain evolved for action —
that animals have brains to let them move around, and that “perception isn’t
just the external input,” Stringer said. “It’s modulated at least to some
extent by what you’re doing at any given time.”
Carsen
Stringer, a neuroscientist at the Howard Hughes Medical Institute’s Janelia
Research Campus in Virginia.
Matt Staley/Janelia Research
Campus
Sensory information represents
only a small part of what’s needed to truly perceive the environment. “You need
to take into account movement, your body relative to the world, in order to
figure out what’s actually out there,” Niell said.
“We used to think that the brain
analyzed all these things separately and then somehow bound them together,”
McCormick said. “Well, we’re starting to learn that the brain does that mixing
of multisensory and movement binding [earlier] than we previously imagined.”
It’s necessary to know how the
body is moving to contextualize and interpret incoming sensory information. If
you’re running, the visual world flies by, and the visual cortex needs to know
that this is driven by your movement. If you’re circling around a monument, the
visual cortex needs to know that you didn’t see 20 different statues, but the
same statue from 20 different angles. “Where is the stability among this storm
of variance?” McCormick said. “That’s why I think that this recent work is very
interesting and very important, because we’re starting to see where the
stability is.”
“Our brains aren’t just thinking
in our heads. Our brains are interacting with our bodies and the way that we
move through the world,” Niell said. “You think, ‘Oh, I’m just thinking,’ or
‘I’m just seeing.’ You don’t think about the fact that your body is playing a
role in that.” So it makes sense that a mouse might need to integrate movement
signals early on (though exactly how, say, the movement of a whisker helps with
vision remains unclear).
In fact, it may go beyond
that. The integration might facilitate what’s known as active sensing, in which an animal coordinates its movements with the information it is trying to gather. Smear is currently studying this in olfaction: He and
his colleagues have found that mice synchronize many of their movements to
their sniffing rhythm (their primary means of receiving odor information) with
surprising precision.
Even more intriguing, such
coordination might help with learning.
A Greater Purpose
Harris, Stringer and their
colleagues posit that this integration of sensory and motor information creates
a mental scaffolding within which reinforcement learning can happen: If a
particular action and stimulus in combination correlates with a noteworthy
outcome — say, receiving a reward, or finding oneself in danger — this kind of
dual neural coding could help the animal predict that outcome next time and act
accordingly.
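A toy example can make that learning argument concrete. The Python sketch below, a made-up illustration rather than anything from the studies described here, compares two simple predictors of reward: one that sees only the stimulus, and one that sees the stimulus-and-action conjunction. When the outcome depends on both, only the joint code learns to predict it reliably.
```python
# Toy illustration: predicting an outcome from the stimulus alone versus from
# the stimulus-action conjunction (the "dual coding" idea), via a delta rule.
import itertools
import random

random.seed(0)
stimuli = ["light", "tone"]
actions = ["lick_left", "lick_right"]

def outcome(stim, act):
    # Made-up rule: reward arrives only for one specific stimulus-action pairing.
    return 1.0 if (stim, act) == ("light", "lick_left") else 0.0

# Value tables for the two predictors, updated with the same learning rate.
v_stim = {s: 0.0 for s in stimuli}
v_joint = {pair: 0.0 for pair in itertools.product(stimuli, actions)}
lr = 0.1

for _ in range(2000):
    s, a = random.choice(stimuli), random.choice(actions)
    r = outcome(s, a)
    v_stim[s] += lr * (r - v_stim[s])              # stimulus-only prediction
    v_joint[(s, a)] += lr * (r - v_joint[(s, a)])  # conjunction prediction

# The stimulus-only learner settles near 0.5 for "light" (reward half the time),
# while the joint learner predicts the rewarded pairing almost exactly.
print("stimulus-only values:", {k: round(v, 2) for k, v in v_stim.items()})
print("joint values:", {k: round(v, 2) for k, v in v_joint.items()})
```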
In
each of these brain scans from a mouse, a single movement or behavior — such as
licking, whisker twitching or pupil dilation — accounts for large variations in
neural activity throughout the animal’s cortex. Even areas of the brain with
specialized functions process neural signals about a variety of physical
behaviors or states.
Courtesy of Anne Churchland
Churchland suggests that movement
signals might help the animal learn in even more concrete ways. In September,
Churchland, Simon Musall, a postdoctoral fellow in her laboratory, and
their colleagues published the results of an experiment in which they
monitored brain activity in mice that were performing a task:
The animals had to grab a handle to start a trial, and lick one way
or another to report a decision. Even though they were focused on their goal,
their neural activity continued to erupt into a chorus of voices dedicated to
trivial movements seemingly unrelated to the task at hand. “Most of the
activity we found in the brain had nothing to do with the decision,” Churchland
said. “It was reflecting the movements the animal was making.”
According to Niell, who was not
involved in the study, “what was really striking was that even when an animal
was doing that kind of a task, which we think of as purely vision, all of these
[unrelated] movements … come in to dominate the signal.”
Churchland and her team also
found that as each mouse was trained, its movements became more closely tied to the task.
At first, for instance, the mouse would move its whiskers randomly, but as it
learned, it would whisk at specific times — when the stimulus was presented,
and when the reward was delivered — even though the act of whisking itself had
nothing to do with the reward or the training involved in the task.
Churchland speculates that
animals may use these types of signals to help them make decisions — that “maybe
for them, this is part of the decision-making process,” she said. “Maybe for
animals, as for humans, part of what it means to think and make decisions is to
move.” She likened it to the ritual a baseball player might perform before
stepping up to bat. “It makes me wonder if these fidgets … serve a greater
purpose.”
Our brains aren’t just thinking
in our heads. Our brains are interacting with our bodies and the way that we
move through the world.
Cris Niell, University of Oregon
“People tend to think of
movements as being separate from cognition — as interfering with cognition,
even,” Churchland said. “We think that, given this work, it might be time to
consider an alternative point of view, that at least for some subjects,
movement is really a part of the cognition.”
Granted, at this point, “some
subjects” mostly means rodents. Scientists are now conducting other experiments
to test whether this kind of integration happens as pervasively in primates
(including humans) and in a similar way.
Nevertheless, researchers agree
that the work heralds a shift in how they conduct their experiments on
perception — namely, it demonstrates that they need to start paying more
attention to behavior, too.
A Surrender of Control
Until now, neuroscientists have
taken a more reductionist route. Much of what we know about neural activity
came first from recordings of anesthetized animals, and later from animals
moving about in a constricted way. The experiments themselves have also been
limited. Niell describes it as the “eye exam” model. “When you go to the
optometrist, you sit there and say: horizontal, vertical, better, worse, E, A,
T,” he said. But that kind of abstract exercise may not be representative of
what we typically do in life. “Our brains did not evolve for us to just sit
there and passively watch something without doing anything about it.”
Even the new work by Harris’ and
Churchland’s teams involved keeping the mouse’s head still to enable readings
from the brain. “If the brain is dominated by movement signals when the animal
can’t move [its head], then what’s it going to look like when the animal can
move?” Smear said.
Scientists are now advocating for
additional approaches that get closer to studying animals performing a natural
behavior, one that’s intuitive without training. Of course, other challenges
come with that: It’s more difficult to determine cause and effect in less
controlled settings, for instance.
Even so, Niell has begun studying
mice that use their eyesight to catch and eat crickets. “It’s something that
the mouse’s brain is wired up to do,” he said, “and also a task where they are
moving, and so therefore they have to integrate their movement with what’s out
there.” He and his colleagues have now found that certain previously identified types of neurons play specific behavioral roles in the capture of prey.
“What we think of as being weird
or unusual signals,” Niell said, “might start to make sense when you actually
let an animal do what it would normally do, and not train the mice to be like
little humans.”
McCormick agreed. “We had a
very impoverished view of the brain,” he said. “I wouldn’t say we have a
perfect view now, but … we have a richer view that needs to grow.”
Editor’s note (added Nov. 8,
2019): The described work conducted in the Cortexlab and by Churchland’s
group received funding from the Simons Foundation, which also funds this
editorially independent magazine.