
Geometry of population activity in spiking networks with low-rank structure

View through CrossRef
Abstract
Recurrent network models are instrumental in investigating how behaviorally relevant computations emerge from collective neural dynamics. A recently developed class of models based on low-rank connectivity provides an analytically tractable framework for understanding how connectivity structure determines the geometry of low-dimensional dynamics and the ensuing computations. Such models, however, lack some fundamental biological constraints, and in particular represent individual neurons as abstract units that communicate through continuous firing rates rather than discrete action potentials. Here we examine how far the theoretical insights obtained from low-rank rate networks transfer to more biologically plausible networks of spiking neurons. Adding a low-rank structure on top of random excitatory-inhibitory connectivity, we systematically compare the geometry of activity in networks of integrate-and-fire neurons to rate networks with statistically equivalent low-rank connectivity. We show that the mean-field predictions of rate networks allow us to identify low-dimensional dynamics at constant population-average activity in spiking networks, as well as novel nonlinear regimes of activity such as out-of-phase oscillations and slow manifolds. We finally exploit these results to directly build spiking networks that perform nonlinear computations.

Author summary
Behaviorally relevant information processing is believed to emerge from interactions among neurons forming networks in the brain, and computational modeling is an important approach for understanding this process. Models of neuronal networks have been developed at different levels of detail, typically with a trade-off between analytic tractability and biological realism. The relation between network connectivity, dynamics, and computations is best understood in abstract models where individual neurons are represented as simplified units with continuous firing activity. Here we examine how far the results obtained in a specific, analytically tractable class of rate models extend to more biologically realistic spiking networks where neurons interact through discrete action potentials. Our results show that abstract rate models provide accurate predictions for the collective dynamics and the resulting computations in more biologically faithful spiking networks.
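The abstract describes adding a rank-one (low-rank) structure on top of random excitatory-inhibitory connectivity and reading out low-dimensional latent dynamics. The sketch below is a minimal illustration of that idea using a simplified rate model rather than the paper's spiking network; the vector names m and n, the tanh nonlinearity, and all parameter values are illustrative assumptions, not the authors' actual setup.

```python
import numpy as np

# Minimal sketch (not the paper's exact model): a rank-one structure m n^T / N
# added on top of random excitatory-inhibitory connectivity, simulated with
# simplified rate dynamics dx/dt = -x + J phi(x). All parameters are assumptions.

rng = np.random.default_rng(0)
N = 1000                         # number of units
n_exc = int(0.8 * N)             # 80% excitatory, 20% inhibitory

# Random E-I connectivity: excitatory columns positive, inhibitory negative,
# sparse (10% connection probability), scaled with 1/sqrt(N).
J_ei = np.zeros((N, N))
J_ei[:, :n_exc] = 1.0 / np.sqrt(N)
J_ei[:, n_exc:] = -4.0 / np.sqrt(N)
J_ei *= rng.random((N, N)) < 0.1

# Rank-one structure: outer product of two random connectivity vectors.
m = rng.standard_normal(N)
n = rng.standard_normal(N)
J = J_ei + np.outer(m, n) / N

# Simplified rate dynamics with a saturating nonlinearity.
phi = np.tanh
dt, T = 0.1, 200.0
x = 0.1 * rng.standard_normal(N)
kappa = []                       # latent variable: overlap of activity with m
for _ in range(int(T / dt)):
    x = x + dt * (-x + J @ phi(x))
    kappa.append(m @ x / N)      # projection onto the rank-one direction

print("final latent value kappa:", kappa[-1])
```

In this kind of model, the trajectory of the latent variable kappa summarizes the low-dimensional component of the population activity that the low-rank part of the connectivity generates, which is the quantity the abstract compares between rate and spiking networks.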

Related Results

Evaluating the Science to Inform the Physical Activity Guidelines for Americans Midcourse Report
Abstract The Physical Activity Guidelines for Americans (Guidelines) advises older adults to be as active as possible. Yet, despite the well documented benefits of physical a...
Embedding optimization reveals long-lasting history dependence in neural spiking activity
Abstract Information processing can leave distinct footprints on the statistics of neural spiking. For example, efficient coding minimizes the statistical dependencies on the spikin...
Autapses enable temporal pattern recognition in spiking neural networks
Abstract Most sensory stimuli are temporal in structure. How action potentials encode the information incoming from sensory stimuli remains one of the central research questions in ...
Adaptive Drop Approaches to Train Spiking-YOLO Network for Traffic Flow Counting
Abstract Traffic flow counting is an object detection problem. YOLO ("You Only Look Once") is a popular object detection network. Spiking-YOLO converts the YOLO network f...
Backpropagation With Sparsity Regularization for Spiking Neural Network Learning
The spiking neural network (SNN) is a possible pathway toward low-power, energy-efficient processing and computing, exploiting the spiking-driven and sparsity features of biological sys...
Spiking neural network with local plasticity and sparse connectivity for audio classification
Purpose: to study the possibility of implementing a data classification method based on a spiking neural network that has a low number of connections and is trained based on loca...
A Spiking Visual Neuron for Depth Perceptual Systems
Abstract The biological visual system encodes information into spikes and processes them in parallel through neural networks, which enables perception with high throughput...
Interplay between periodic stimulation and GABAergic inhibition in striatal network oscillations
Abstract Network oscillations are ubiquitous across many brain regions. In the basal ganglia, oscillations are also present at many levels and over a wide range of characteristic fre...
