Backpropagation With Sparsity Regularization for Spiking Neural Network Learning
The spiking neural network (SNN) is a promising pathway toward low-power, energy-efficient processing and computing, exploiting the spike-driven and sparse characteristics of biological systems. This article proposes a sparsity-driven SNN learning algorithm, backpropagation with sparsity regularization (BPSR), which aims to achieve improved spiking and synaptic sparsity. Backpropagation incorporating a spiking regularization term is used to minimize the spiking firing rate while maintaining accuracy. The backpropagation captures temporal information and extends to spiking recurrent layers to support learning of brain-like structures. A rewiring mechanism with synaptic regularization is introduced to further reduce redundancy in the network structure: rewiring based on weight and gradient governs the pruning and growth of synapses. Experimental results demonstrate that a network learned by BPSR exhibits synaptic sparsity and closely resembles biological systems. The algorithm not only balances accuracy against firing rate, but also facilitates SNN learning by suppressing information redundancy. We evaluate BPSR on the visual datasets MNIST, N-MNIST, and CIFAR10, and further test it on the MIT-BIH and gas sensor datasets. Results show that our algorithm achieves comparable or superior accuracy relative to related works, with sparse spikes and synapses.
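The spiking-regularization idea described in the abstract (minimizing the firing rate alongside the task loss) can be sketched as a penalized loss. This is a minimal illustration, not the paper's exact formulation: `spikes`, the `lam` penalty weight, and the mean-firing-rate penalty are all assumptions.

```python
import numpy as np

def loss_with_spike_regularization(task_loss, spikes, lam=1e-3):
    """Hypothetical penalized loss: task loss plus a firing-rate penalty.

    spikes: (timesteps, neurons) binary array from an assumed SNN forward pass.
    lam: assumed regularization weight trading accuracy against sparsity.
    """
    firing_rate = spikes.mean()           # average spikes per neuron per step
    return task_loss + lam * firing_rate  # higher firing rates cost more

# Usage: a network firing at roughly 10% adds a small penalty to the loss.
rng = np.random.default_rng(0)
spikes = (rng.random((100, 64)) < 0.1).astype(float)
total = loss_with_spike_regularization(0.5, spikes, lam=0.1)
```

Minimizing this combined objective with backpropagation (through a surrogate gradient, in practice) pushes the network toward sparse spiking while the task term guards accuracy.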
Frontiers Media SA
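The rewiring mechanism in the abstract (pruning and growth of synapses guided by weight and gradient) might look like the following sketch. The scoring rule `|weight| + |gradient|`, the prune fraction, and the random regrowth are assumptions for illustration; the paper's exact criterion may differ.

```python
import numpy as np

def rewire(weights, grads, mask, prune_frac=0.1, rng=None):
    """Hypothetical rewiring step: prune weak synapses, regrow elsewhere.

    weights, grads: dense arrays of the same shape.
    mask: 0/1 array marking which synapses currently exist.
    """
    if rng is None:
        rng = np.random.default_rng()
    # Assumed importance score combining weight magnitude and gradient magnitude.
    score = np.abs(weights) + np.abs(grads)
    score[mask == 0] = np.inf                       # only active synapses prunable
    n_prune = int(prune_frac * mask.sum())
    prune_idx = np.argsort(score, axis=None)[:n_prune]
    mask.flat[prune_idx] = 0                        # prune least important synapses
    weights.flat[prune_idx] = 0.0
    # Grow an equal number of new synapses at random inactive positions.
    inactive = np.flatnonzero(mask == 0)
    grow_idx = rng.choice(inactive, size=n_prune, replace=False)
    mask.flat[grow_idx] = 1
    weights.flat[grow_idx] = rng.normal(0.0, 0.01, size=n_prune)
    return weights, mask

# Usage: total synapse count is conserved; only the connectivity pattern changes.
rng = np.random.default_rng(1)
w = rng.normal(size=(8, 8))
g = rng.normal(size=(8, 8))
mask = (rng.random((8, 8)) < 0.8).astype(int)
before = mask.sum()
w, mask = rewire(w, g, mask, prune_frac=0.1, rng=rng)
```

Keeping the synapse budget fixed while moving connections away from low-importance positions is what yields synaptic sparsity without shrinking the network's capacity.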
Related Results
Embedding optimization reveals long-lasting history dependence in neural spiking activity
Abstract: Information processing can leave distinct footprints on the statistics of neural spiking. For example, efficient coding minimizes the statistical dependencies on the spikin...
Spiking neural network with local plasticity and sparse connectivity for audio classification
Purpose. Studying the possibility of implementing a data classification method based on a spiking neural network, which has a low number of connections and is trained based on loca...
Adaptive Drop Approaches to Train Spiking-YOLO Network for Traffic Flow Counting
Abstract: Traffic flow counting is an object detection problem. YOLO ("You Only Look Once") is a popular object detection network. Spiking-YOLO converts the YOLO network f...
A Comparative Study of SSA-BPNN, SSA-ENN, and SSA-SVR Models for Predicting the Thickness of an Excavation Damaged Zone around the Roadway in Rock
Due to the disturbance effect of excavation, the original stress is redistributed, resulting in an excavation damaged zone around the roadway. It is significant to predict the thic...
Autapses enable temporal pattern recognition in spiking neural networks
Abstract: Most sensory stimuli are temporal in structure. How action potentials encode the information incoming from sensory stimuli remains one of the central research questions in ...
APPLICATION OF INTELLIGENT SYSTEM WITH BACKPROPAGATION MODEL IN CLOUD IMAGE CLASSIFICATION
The clouds have different patterns on each type and each type has different properties. The introduction of the type, shape, and nature of the cloud is indispensable in the weather...
Deep convolutional neural network and IoT technology for healthcare
Background: Deep Learning is an AI technology that trains computers to analyze data in an approach similar to the human brain. Deep learning algorithms can find complex patterns in ...
Analytical Solutions to Minimum-Norm Problems
For G ∈ ℝ^{m×n} and g ∈ ℝ^m, the minimization min‖Gψ − g‖₂, with ψ ∈ ℝ^n, is known as the Tykhonov regularization. We transport the Tykhonov regularization to an infinite-dimensional setting, t...

