Data from the Salk Institute shows brain’s memory capacity is in the petabyte range, as much as the entire Web
Newswise, January 27, 2016: Salk researchers and collaborators have achieved critical insight into the size of neural connections, putting the memory capacity of the brain far higher than common estimates.
The new work also answers a longstanding question as to how the brain is so energy efficient, and could help engineers build computers that are incredibly powerful but also conserve energy.
“This is a real bombshell in the field of neuroscience,” says Terry Sejnowski, Salk professor and co-senior author of the paper, which was published in eLife.
“We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power.

“Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10, to at least a petabyte, in the same ballpark as the World Wide Web.”
Our memories and thoughts are the result of patterns of electrical and chemical activity in the brain.

A key part of the activity happens when branches of neurons, much like electrical wire, interact at certain junctions, known as synapses. An output ‘wire’ (an axon) from one neuron connects to an input ‘wire’ (a dendrite) of a second neuron.
Signals travel across the synapse as chemicals called neurotransmitters to tell the receiving neuron whether to convey an electrical signal to other neurons. Each neuron can have thousands of these synapses with thousands of other neurons.
“When we first reconstructed every dendrite, axon, glial process, and synapse from a volume of hippocampus the size of a single red blood cell, we were somewhat bewildered by the complexity and diversity amongst the synapses,” says Kristen Harris, co-senior author of the work and professor of neuroscience at the University of Texas, Austin.

“While I had hoped to learn fundamental principles about how the brain is organized from these detailed reconstructions, I have been truly amazed at the precision obtained in the analyses of this report.”
Synapses are still a mystery, though their dysfunction can cause a range of neurological diseases. Larger synapses, with more surface area and more neurotransmitter vesicles, are stronger, making them more likely to activate their surrounding neurons than medium or small synapses.
The Salk team, while building a 3D reconstruction of rat hippocampus tissue (the memory center of the brain), noticed something unusual.
In some cases, a single axon from one neuron formed two synapses reaching out to a single dendrite of a second neuron, meaning that the first neuron appeared to be sending a duplicate message to the receiving neuron.

At first, the researchers didn’t think much of this duplication, which occurs about 10 percent of the time in the hippocampus.
But Tom Bartol, a Salk staff scientist, had an idea: if they could measure the difference between two very similar synapses such as these, they might glean insight into synaptic sizes, which so far had only been classified in the field as small, medium and large.
To do this, the researchers used advanced microscopy and computational algorithms they had developed to image rat brains and reconstruct the connectivity, shapes, volumes and surface area of the brain tissue down to the nanometer scale.
The scientists expected the synapses would be roughly similar in size, but were surprised to discover the synapses were nearly identical.
“We were amazed to find that the difference in the sizes of the pairs of synapses was very small, on average only about eight percent different in size. No one thought it would be such a small difference. This was a curveball from nature,” says Bartol.
Because the memory capacity of neurons is dependent upon synapse size, this eight percent difference turned out to be a key number the team could then plug into their algorithmic models of the brain to measure how much information could potentially be stored in synaptic connections.
It was known before that the range in sizes between the smallest and largest synapses was a factor of 60, and that most are small.
But armed with the knowledge that synapses of all sizes could vary in increments as small as eight percent within that factor-of-60 range, the team determined there could be about 26 categories of synapse sizes, rather than just a few.
“Our data suggests there are 10 times more discrete sizes of synapses than previously thought,” says Bartol. In computer terms, 26 sizes of synapses correspond to about 4.7 “bits” of information.
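The arithmetic behind that figure is straightforward: a synapse that can occupy any one of 26 distinguishable states carries log2(26) bits. A minimal check (Python; the variable names are ours, not the paper’s):

```python
import math

num_sizes = 26                        # distinguishable synapse-size categories reported by the team
bits_per_synapse = math.log2(num_sizes)
print(f"{bits_per_synapse:.1f} bits")  # -> 4.7 bits, matching the article's figure
```

By the same measure, the one-to-two-bit estimates cited in the next paragraph correspond to only two to four distinguishable synaptic states.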
Previously, it was thought that the brain was capable of storing just one to two bits per synapse for short- and long-term memory in the hippocampus.
“This is roughly an order of magnitude of precision more than anyone has ever imagined,” says Sejnowski.
What makes this precision puzzling is that hippocampal synapses are notoriously unreliable. When a signal travels from one neuron to another, it typically activates that second neuron only 10 to 20 percent of the time.
“We had often wondered how the remarkable precision of the brain can come out of such unreliable synapses,” says Bartol.
One answer, it seems, is in the constant adjustment of synapses, averaging out their success and failure rates over time. The team used their new data and a statistical model to find out how many signals it would take a pair of synapses to get to that eight percent difference.
The researchers calculated that for the smallest synapses, about 1,500 signaling events (roughly 20 minutes of activity) are needed to cause a change in their size and strength, while for the largest synapses, only a couple hundred signaling events (1 to 2 minutes) cause a change.
“This means that every 2 or 20 minutes, your synapses are going up or down to the next size. The synapses are adjusting themselves according to the signals they receive,” says Bartol.
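The article doesn’t spell out the team’s statistical model, but the quoted numbers are consistent with a simple averaging argument: if each transmission is an independent coin flip with release probability p, the standard error of an estimate of p falls to eight percent of p only after roughly (1 − p) / (p × 0.08²) trials. The sketch below is a back-of-envelope reading under that binomial assumption, not the paper’s actual calculation, and the release probabilities are illustrative:

```python
import math

def events_to_reach(rel_precision: float, p_release: float) -> int:
    """Trials needed before the standard error of an estimated release
    probability falls to rel_precision * p_release, treating each
    transmission as an independent coin flip (binomial statistics):
    SE/p = sqrt((1 - p) / (p * n))  =>  n = (1 - p) / (p * eps^2)."""
    return math.ceil((1 - p_release) / (p_release * rel_precision**2))

# Illustrative probabilities: ~10% for a small, unreliable synapse
# (the article cites 10 to 20 percent), higher for a large, strong one.
for p in (0.10, 0.20, 0.50):
    print(f"p = {p:.2f}: ~{events_to_reach(0.08, p):,} events")
# p = 0.10 -> ~1,407 events, the order of the article's ~1,500 (about 20 minutes)
# p = 0.50 -> ~157 events, the order of "a couple hundred" (1 to 2 minutes)
```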
“Our prior work had hinted at the possibility that spines and axons that synapse together would be similar in size, but the reality of the precision is truly remarkable and lays the foundation for whole new ways to think about brains and computers,” says Harris.
“The work resulting from this collaboration has opened a new chapter in the search for learning and memory mechanisms.” Harris adds that the findings suggest more questions to explore: for example, whether similar rules apply for synapses in other regions of the brain, and how those rules differ during development and as synapses change during the initial stages of learning.
“The implications of what we found are far-reaching,” adds Sejnowski. “Hidden under the apparent chaos and messiness of the brain is an underlying precision to the size and shapes of synapses that was hidden from us.”
The findings also offer a valuable explanation for the brain’s surprising efficiency. The waking adult brain generates only about 20 watts of continuous power, as much as a very dim light bulb.
The Salk discovery could help computer scientists build ultraprecise but energy-efficient computers, particularly ones that employ “deep learning” and artificial neural nets, techniques capable of sophisticated learning and analysis such as speech recognition, object recognition and translation.
“This trick of the brain absolutely points to a way to design better computers,” says Sejnowski. “Using probabilistic transmission turns out to be just as accurate and to require much less energy, for both computers and brains.”
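As an illustration of what probabilistic transmission might look like in an artificial neural net (our sketch, not the paper’s proposal): each connection transmits with a synapse-like probability and the output is rescaled so it is correct on average, much as dropout works in deep learning. Most connections stay silent on any given pass, which is where the energy saving would come from.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_layer(x, weights, p_release=0.2):
    """One layer in which each synapse transmits with probability
    p_release (cf. the 10-20% reliability cited above). Dividing by
    p_release keeps the expected output equal to the deterministic
    layer's output, so accuracy is preserved on average while ~80% of
    synapses are silent (and cost nothing) on any single pass."""
    mask = rng.random(weights.shape) < p_release
    return x @ (weights * mask) / p_release

x = rng.standard_normal(8)
w = rng.standard_normal((8, 4))
print(stochastic_layer(x, w))                                          # one noisy, cheap pass
print(np.mean([stochastic_layer(x, w) for _ in range(5000)], axis=0))  # averages toward...
print(x @ w)                                                           # ...the deterministic output
```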
Other authors on the paper were Cailey Bromer of the Salk Institute; Justin Kinney of the McGovern Institute for Brain Research; and Michael A. Chirillo and Jennifer N. Bourne of the University of Texas, Austin.
The work was supported by the NIH and the Howard Hughes Medical Institute.