
Synchronicity - spatio-temporal spiking neuron models


The previous post began with a slogan that summarizes Donald Hebb's theory of learning: "Cells that fire together, wire together". A number of papers have appeared in recent years that take this idea further, proposing that coinciding pulses are one of the most important ways the brain transmits information. This follows naturally from Hebbian learning: the brain adapts its network of synaptic connections by pruning connections whose incoming signals are not correlated with the other signals arriving at the neuron, and by reinforcing connections where this kind of coincidence does occur. It does this for a reason: to establish the 'right' set of connections and synaptic weights so that one input, or one set of inputs, becomes associated with another. This kind of correlation between events has been proposed as what knowledge itself is made of, and as the basis for some of the key aspects of cognition and symbolic thought (ref.).
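To make the pruning/reinforcement idea concrete, here is a minimal sketch of a correlation-based (Hebbian) weight update in Python; the learning and decay rates and the toy spike patterns are purely illustrative, not a rule taken from any particular paper.

```python
import numpy as np

def hebbian_update(weights, pre_spikes, post_spike, lr=0.01, decay=0.001):
    """Strengthen synapses whose presynaptic spikes coincide with a
    postsynaptic spike; let uncorrelated synapses slowly decay (prune)."""
    # pre_spikes: binary vector, 1 where an input fired in this time step
    # post_spike: 1 if the receiving neuron fired in this time step
    weights = weights + lr * post_spike * pre_spikes                     # reinforce coincidences
    weights = weights - decay * (1 - pre_spikes * post_spike) * weights  # prune the rest
    return np.clip(weights, 0.0, 1.0)

# Example: inputs 0 and 1 repeatedly fire together with the output neuron,
# while input 2 fires at unrelated times, so its weight decays toward zero.
w = np.full(3, 0.5)
for _ in range(200):
    w = hebbian_update(w, pre_spikes=np.array([1, 1, 0]), post_spike=1)
    w = hebbian_update(w, pre_spikes=np.array([0, 0, 1]), post_spike=0)
print(w)  # correlated synapses grow, the uncorrelated one shrinks
```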

Not everyone agrees that the relative timing of spikes is what carries information, however. Many researchers focus on trying to understand the 'neural code' that the brain uses to send information from one neuron to another. Claude Shannon's information theory is often used as a mathematical framework to determine how many 'bits of data' are sent from one neuron to another and how efficiently that information is transferred. Researchers who favour this approach generally model the transmission of spikes from one neuron to the next using "rate coding" (a.k.a. "frequency coding") models. These models are based on the idea that a neuron will only reach the threshold needed to generate a spike and send it on to the next neuron when the number of spikes it receives in a given time period exceeds some value.
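As a rough sketch of what a rate-coding model assumes, the toy neuron below fires only when the count of incoming spikes within a sliding window crosses a threshold; the window length and threshold are made-up values for illustration.

```python
import numpy as np

def rate_coded_neuron(incoming_spike_times, window=0.05, threshold=10):
    """Fire whenever the number of incoming spikes in a sliding window
    reaches a threshold; only the count matters, not the exact timing."""
    incoming_spike_times = np.sort(incoming_spike_times)
    output_spikes = []
    for t in incoming_spike_times:
        # count spikes that arrived within the last `window` seconds
        count = np.sum((incoming_spike_times > t - window) & (incoming_spike_times <= t))
        if count >= threshold:
            output_spikes.append(t)
    return output_spikes

# A burst of 12 spikes in 30 ms crosses the threshold; the same 12 spikes
# spread over a full second never do.
burst = np.linspace(0.0, 0.03, 12)
sparse = np.linspace(0.0, 1.0, 12)
print(len(rate_coded_neuron(burst)), len(rate_coded_neuron(sparse)))
```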

Although information theory has proven to be a useful way to gain insight into the lower levels of vision processing (e.g. Ralph Linsker's Infomax principle), it doesn't seem to me that constraining the theory of how neurons interact so that it can be described with the mathematics of information theory is helpful. 'How much information' is transferred from one neuron to the next is determined primarily by which neuron the spike is sent to, and information theory is not the right tool for capturing that. Using this framework narrows the focus down to things like the frequency of the spikes sent from one neuron to the next, and researchers end up making the wrong kind of simplifications to get there: combining both the efficiency of the synapse and the 'frequency' of the incoming spikes into a single 'synaptic weight', and ignoring the idea that the network of neurons is constructed to enable coincidence detection. A different way of measuring information is needed, one that takes into account the hundreds, thousands or tens of thousands of connections each neuron can send a spike to, whether the spike excites or inhibits the downstream neuron, the topology of the connections between groups of neurons, the fact that neurons can rewire themselves dynamically, the role that neurotransmitters play, and so on.
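For reference, the kind of calculation this approach leads to looks something like the sketch below: a naive 'bits per spike train' estimate obtained by binning spikes and applying Shannon's entropy formula. The binning scheme and the independence assumption are simplifications of my own, chosen only to keep the example short.

```python
import numpy as np

def spike_train_entropy(spike_times, duration, bin_size=0.01):
    """Naive entropy estimate (in bits) for a spike train: bin it into fixed
    windows and treat each bin as an independent binary symbol."""
    bins = np.zeros(int(np.ceil(duration / bin_size)), dtype=int)
    idx = (np.asarray(spike_times) / bin_size).astype(int)
    bins[np.clip(idx, 0, len(bins) - 1)] = 1
    p = bins.mean()                        # probability that a bin contains a spike
    if p in (0.0, 1.0):
        return 0.0
    h_per_bin = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return h_per_bin * len(bins)           # total bits over the whole train

# ~20 spikes scattered over one second, 10 ms bins
rng = np.random.default_rng(0)
train = np.sort(rng.uniform(0, 1.0, 20))
print(spike_train_entropy(train, duration=1.0))
```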

The rate coding model aligns well with the popular computer science approaches to implementing neural networks in software, where time plays no significant role. Training data is used to make the weights converge to whatever is required to map the input data to the expected output data in the training set. What these software models completely ignore is that input signals in the brain arrive as neuronal spikes, and that the correlation between spikes is what causes synapses to grow stronger or weaker. Rate coding attempts to bridge this gap by noting that spikes from a neuron are often generated in a repetitive series - a spike train - and that the frequency of these spikes tends to drive the receiving neuron past its signaling threshold. The relative density of spikes over a given time interval, the thinking goes, is what matters; the timing of individual spikes is not important. Rate coding also glosses over the fact that spikes do not arrive at a neuron at a constant rate - they arrive sporadically and in bursts (ref.).
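The sketch below shows what gets lost in that reduction: two inputs with identical firing rates are indistinguishable to a rate code, while a simple coincidence count (the quantity the timing-based view cares about) separates them easily. The spike trains and the 2 ms coincidence window are illustrative values I picked for the example.

```python
import numpy as np

def firing_rate(spike_times, duration):
    """Rate coding reduces a whole spike train to a single number."""
    return len(spike_times) / duration

def coincidences(train_a, train_b, window=0.002):
    """Count pairs of spikes from two inputs that arrive within `window`
    seconds of each other - the temporal structure rate coding throws away."""
    return sum(1 for a in train_a for b in train_b if abs(a - b) <= window)

# Two inputs with identical 10 Hz rates over one second...
regular = np.arange(0.0, 1.0, 0.1)                       # evenly spaced spikes
burst = np.concatenate([np.linspace(0.0, 0.02, 5),
                        np.linspace(0.5, 0.52, 5)])       # two tight bursts

print(firing_rate(regular, 1.0), firing_rate(burst, 1.0))  # 10.0 10.0 - identical rates
# ...but a coincidence detector paired with a bursty partner input tells them apart
partner = np.linspace(0.0, 0.02, 5)
print(coincidences(regular, partner), coincidences(burst, partner))
```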

The rank coding model (a.k.a. delay coding), on the other hand, is based on the idea that sensory neurons (e.g. in the retina and inner ear) respond to more energetic input signals by generating a spike earlier; that spike arrives at the downstream neuron earlier than other spikes and therefore has a higher impact, or 'ranking', relative to later arrivals. Rank coding is motivated by specific cases where the brain responds to an incoming stimulus (e.g. a noise or an image) so quickly that the only plausible explanation is a single neuron responding to the very first spike sent from the neuron in the eye or ear. Software implementations of rank-coding methods and "liquid state machines" are starting to appear (e.g. SpikeNET Technology), and they outperform standard software neural networks for certain applications. These are the leading edge of a "third generation" of software models of neural networks, which look very promising.
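A minimal sketch of the rank-order idea, loosely following the scheme described by Thorpe and colleagues: the earlier a spike arrives, the more it contributes, with each successive rank attenuated by a fixed modulation factor. The weights and modulation value here are arbitrary illustrations, not parameters from SpikeNET itself.

```python
import numpy as np

def rank_order_response(arrival_times, weights, modulation=0.8):
    """Rank-order coding: earlier-arriving spikes carry more influence.
    Each input's contribution is scaled by modulation**rank, where
    rank 0 belongs to the first spike to arrive."""
    order = np.argsort(arrival_times)        # earliest spike gets rank 0
    ranks = np.empty_like(order)
    ranks[order] = np.arange(len(order))
    return float(np.sum(weights * modulation ** ranks))

# The most strongly weighted input corresponds to the most energetic stimulus,
# so the same set of spikes produces a larger response when that input leads.
weights = np.array([1.0, 0.6, 0.3])
print(rank_order_response(np.array([0.001, 0.004, 0.009]), weights))  # preferred order
print(rank_order_response(np.array([0.009, 0.004, 0.001]), weights))  # reversed order
```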

From Networks of Spiking Neurons: A New Generation of Neural Network Models by Thomas Natschläger (December 1998):

The First Generation of Models
If one wants to understand how the nervous system computes a function, one has to think about how information about the environment or internal states is represented and transmitted. From the fact that the shape of an action potential is always the same, one can exclude the possibility that the voltage trajectory of an action potential carries relevant information. Thus a central question in the field of neuroscience is how neurons encode information in the sequence of action potentials they emit. In this article we characterize neural network models by their assumptions about the encoding scheme.

In 1943 McCulloch and Pitts proposed the first neuron model: the threshold gate. The characteristic of their model was that they treated a neuron as a binary device; that is, they distinguished only between the occurrence and absence of a spike. The threshold gate is used as a building block for various network types including multilayer perceptrons, Hopfield networks and the Boltzmann machine. It turned out that the threshold gate is a computationally powerful device: one can compute complex functions with rather small networks made up of threshold gates. From a theoretical point of view the threshold gate is a very interesting model, but it is unlikely that real biological systems use such a binary encoding scheme. A prerequisite for such a binary coding scheme is some kind of global clocking mechanism, and it is very unlikely that such a mechanism exists in biological systems.

The Second Generation
Another possibility is that the number of spikes per second (called the firing rate) encodes relevant information. This idea led to a model neuron known as the sigmoidal gate. The output of a sigmoidal gate is a number which is thought to represent the firing rate of the neuron. A huge amount of literature discusses all aspects of this kind of neural network model in detail. We just want to note that networks of sigmoidal gates can in principle compute any analog function, and that it was with this type of model that the question of learning in neural networks was intensively investigated for the first time.

...
The Third Generation: Networks of Spiking Neurons (SNN)
...[Results] from experimental neurobiology gave rise to a new class of neural network models which also incorporate the timing of individual spikes. Thus time plays a central role in SNNs, whereas in most other neural network models there is no notion of time at all.


The standard computer science neural networks are based on the second generation of models and have proven useful in applications such as text recognition, speech recognition and stock market prediction. Software implementations of the third generation of models are starting to appear. They are based on more faithful computational representations of biological neural networks and will hopefully open up new types of software applications in areas such as machine vision and robotics.
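To summarize the three generations in code, here is a toy unit from each: a McCulloch-Pitts threshold gate, a sigmoidal gate, and a leaky integrate-and-fire spiking neuron in which the timing of individual input spikes determines whether an output spike is produced. All parameter values are illustrative.

```python
import numpy as np

# First generation: McCulloch-Pitts threshold gate - spike or no spike.
def threshold_gate(inputs, weights, threshold=1.0):
    return 1 if np.dot(inputs, weights) >= threshold else 0

# Second generation: sigmoidal gate - the output number stands in for a firing rate.
def sigmoidal_gate(inputs, weights, bias=0.0):
    return 1.0 / (1.0 + np.exp(-(np.dot(inputs, weights) + bias)))

# Third generation: a leaky integrate-and-fire neuron - the timing of each
# individual input spike now matters.
def lif_neuron(spike_times, weight=0.4, tau=0.02, threshold=1.0, dt=0.001, duration=0.1):
    v, out = 0.0, []
    spikes = set(int(round(t / dt)) for t in spike_times)
    for step in range(int(duration / dt)):
        v *= np.exp(-dt / tau)            # membrane potential leaks away
        if step in spikes:
            v += weight                   # each incoming spike bumps the potential
        if v >= threshold:
            out.append(step * dt)         # output spike, then reset
            v = 0.0
    return out

x, w = np.array([1, 0, 1]), np.array([0.6, 0.4, 0.5])
print(threshold_gate(x, w), round(sigmoidal_gate(x, w), 3))

# Three closely spaced input spikes push the LIF neuron over threshold;
# the same three spikes spread out in time do not.
print(lif_neuron([0.010, 0.012, 0.014]))   # -> one output spike
print(lif_neuron([0.010, 0.040, 0.070]))   # -> no output spike
```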

References


Rate Codes and Shannon's Information Theory
Information theory and neural coding - Alexander Borst and Frédéric E. Theunissen
Energy-efficient interspike interval codes
Neural coding and decoding: communication channels and quantization
Introduction: statistical and machine learning based approaches to neurobiology - Shin Ishii, Nara Institute of Science and Technology
Information Theory and Systems Neuroscience
Entropy as an Index of the Informational State of Neurons

Rank Coding and Temporal Coding
SpikeNET - Scientific papers by Simon Thorpe & colleagues
Publications related to SpikeNET
The Neural Basis of Temporal Processing
Temporal Coding and Analysis of Spike Sequences
Synfire Chains and Cortical Songs: Temporal Modules of Cortical Activity

Spike-based Neural Network Models
Networks of Spiking Neurons: A New Generation of Neural Network Models - Thomas Natschläger
Computing with spikes - Wolfgang Maass
