New research suggests the brain is capable of storing ten times more data than previously believed, possibly in the petabyte range, comparable to the entire World Wide Web. The findings add to our knowledge of the brain's energy efficiency and may also help in building more powerful computers that use energy more efficiently.
Synapses store and transmit information between the roughly 86 billion neurons in the human brain. These synaptic gaps establish contact between neurons, although no actual physical "connection" takes place as it does in an electrical wire. When the concentration of ions flowing into a neuron reaches a certain threshold, a nerve impulse is triggered: the electrical potential between the interior and exterior of the cell reverses, and the nerve cell releases neurotransmitters across the synaptic gap, where the process repeats in the next cell. The message leaves via the axon of the sending neuron and is received by the dendrites of the receiving neuron. Every neuron shares thousands of these synapses with thousands of other neurons.
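The threshold behaviour described above can be illustrated with a deliberately simplified "integrate-and-fire" sketch. All constants here (threshold, leak factor, input currents) are illustrative assumptions for the sake of the example, not values from the research:

```python
# A minimal, highly simplified integrate-and-fire model of the threshold
# behaviour described above: incoming ion current raises the membrane
# potential until it crosses a threshold, the neuron "fires" a nerve
# impulse, and the potential resets.
def simulate(inputs, threshold=1.0, leak=0.9):
    potential = 0.0
    spike_times = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # leaky integration of input
        if potential >= threshold:
            spike_times.append(t)  # impulse: neurotransmitter release
            potential = 0.0        # reset after firing
    return spike_times

# A steady sub-threshold input still fires periodically, because charge
# accumulates faster than it leaks away.
print(simulate([0.4] * 10))  # [2, 5, 8]
```

Real neurons are far more complex, but the sketch captures the article's point: firing is an all-or-nothing event gated by an accumulation threshold.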
How synapses change depends on patterns of activity and on neuromodulators, chemicals that diffuse through large areas of the nervous system and regulate diverse populations of neurons. These changes in the strength of synapses, and in the pattern of synaptic connectivity, govern the process of learning. The hippocampus, an important learning and memory centre, has a high level of plasticity and is a region where these synaptic processes can be observed in operation.
New research by Bartol and colleagues at the Salk Institute examined synapses in hippocampal brain tissue, aiming to uncover how the brain achieves powerful computation on small amounts of energy. From analyses of the tissue, the team discovered that synapses exist in many more forms and sizes than previously thought. Using advanced microscopy and computational algorithms developed specifically to image the connectivity, shape, volume and surface area of the brain tissue at the nanometre scale, the team found that synapses formed by the same axon onto the same dendrite were nearly identical, varying in size by only around 8%, less than expected.
The team noticed something rarely seen: an axon from one neuron forming two synapses onto a single dendrite of a receiving neuron, suggesting the first neuron may be sending a duplicate message. This duplication was observed in around 10% of the hippocampal tissue, and measuring these paired synapses gave insight into the range of synaptic sizes; previously, synapses had only been described as small, medium or large. Because the memory capacity of an individual neuron depends on the size of a synapse, the roughly 8% difference observed between pairs allowed the team to measure how much information synapses may store. Working in increments of that size, the team calculated that 26 categories of synapse may exist.
Lead author Bartol said, "[The] data suggest there are 10 times more discrete sizes of synapses than previously thought." This equates to around 4.7 bits of information per synapse; previous estimates had put memory within the hippocampus at around two bits per synapse. "This is approximately an order of magnitude of precision more than anyone ever imagined," says Sejnowski. How this efficiency and precision arise within the hippocampus, known to have unreliable synapses, also puzzled the team.
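The 4.7-bit figure follows directly from the 26 size categories: a synapse that can occupy any one of 26 distinguishable states stores log2(26) bits. A minimal sketch of that arithmetic, with a back-of-envelope scale-up to whole-brain capacity (the total synapse count is an assumed textbook-style estimate, not a figure from this study):

```python
import math

# Information held by a synapse that can sit in one of 26
# distinguishable size states.
categories = 26
bits_per_synapse = math.log2(categories)
print(f"{bits_per_synapse:.1f} bits per synapse")  # 4.7 bits per synapse

# Hedged scale-up: assuming on the order of 10**15 synapses in the
# human brain (estimates commonly range from 10**14 to 10**15; this
# value is an assumption, not from the study).
synapses = 1e15
total_bits = bits_per_synapse * synapses
total_petabytes = total_bits / 8 / 1e15
print(f"~{total_petabytes:.2f} petabytes")
```

Under that assumed synapse count the result lands at roughly half a petabyte, which is how an order-of-magnitude "petabyte range" estimate for the whole brain arises.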
A possible answer lies in the constant change of synapses, which averages out the success or failure of individual nerve impulses (action potentials). For the smallest synapses, a staggering 1,500 signalling events are needed to trigger a change in the size and functioning of the cell; for the largest, only around 200 signalling events generate this change. "This means every 2 or 20 minutes, your synapses are going up or down to the next size. The synapses are adjusting themselves according to the signals received," said Bartol.
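The quoted figures can be sanity-checked with simple arithmetic. Assuming signalling events arrive at a roughly constant rate (an assumption made here for illustration; the article gives event counts and times but no rate), the rate implied by the large-synapse figures predicts a small-synapse adjustment time on the same order as the quoted 20 minutes:

```python
# Implied event rate from the large-synapse figures quoted above:
# ~200 events over ~2 minutes.
rate_per_min = 200 / 2  # 100 signalling events per minute (assumed constant)

# Time for the smallest synapses to accumulate ~1,500 events at that rate.
small_synapse_minutes = 1500 / rate_per_min
print(small_synapse_minutes)  # 15.0 -- same order as the quoted ~20 minutes
```

The two sets of numbers are mutually consistent to within rough rounding, which is all a popular summary of this kind claims.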
"This is a real bombshell in the field of neuroscience. We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power," said Sejnowski. "This trick of the brain absolutely points to a way to design better computers."
How does the human brain carry out such complex tasks on so little energy?