Redundancy-Aware Pruning of Convolutional Neural Networks
Pruning is an effective way to slim and speed up convolutional neural networks. Previous work has generally pruned networks directly in the original feature space, without accounting for correlations between neurons; we argue that such pruning leaves residual redundancy in the pruned networks. In this letter, we propose pruning in an intermediate space in which the correlations between neurons have been eliminated. To this end, the input and output of a convolutional layer are first mapped to an intermediate space by an orthogonal transformation, and neurons are then evaluated and pruned in that space. Extensive experiments show that our redundancy-aware pruning method surpasses state-of-the-art pruning methods in both efficiency and accuracy. Notably, ResNet models pruned with our method for a threefold speed-up achieve performance competitive even with DenseNet, at fewer floating-point operations (FLOPs).
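As a rough illustration of the idea the abstract describes (not the letter's actual implementation), the sketch below decorrelates a convolutional layer's output channels with a PCA-style orthogonal transform and ranks components in that decorrelated space. The function name, the keep_ratio parameter, and the PyTorch framing are illustrative assumptions; the sketch handles only the output side of a standard (groups=1) convolution.

import torch
import torch.nn as nn

@torch.no_grad()
def prune_conv_in_decorrelated_space(conv, calib_inputs, keep_ratio=0.5):
    """Hypothetical sketch: prune a Conv2d's output channels in a
    decorrelated (intermediate) space.

    1. Run calibration data through `conv`, treating every spatial
       position as one sample of the C-dimensional channel response.
    2. Eigendecompose the response covariance to obtain an orthogonal
       transform Q that decorrelates the channels.
    3. Rank decorrelated components by their variance (eigenvalues)
       and keep the strongest ones.
    4. Fold the truncated transform back into the conv weights, so the
       pruned layer directly emits the kept components.
    """
    y = conv(calib_inputs)                           # (N, C, H, W)
    C = y.shape[1]
    samples = y.permute(0, 2, 3, 1).reshape(-1, C)   # (N*H*W, C)
    samples = samples - samples.mean(dim=0, keepdim=True)
    cov = samples.T @ samples / (samples.shape[0] - 1)

    eigvals, Q = torch.linalg.eigh(cov)              # ascending eigenvalues
    k = max(1, int(C * keep_ratio))
    keep = eigvals.argsort(descending=True)[:k]
    Q_k = Q[:, keep]                                 # (C, k), orthonormal columns

    # Fold Q_k^T into the filters: new_w[j] = sum_i Q_k[i, j] * w[i],
    # so the pruned conv outputs the k decorrelated components directly.
    w = conv.weight.reshape(C, -1)                   # (C, in*kh*kw)
    new_w = (Q_k.T @ w).reshape(k, *conv.weight.shape[1:])

    pruned = nn.Conv2d(conv.in_channels, k, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.copy_(new_w)
    if conv.bias is not None:
        pruned.bias.copy_(Q_k.T @ conv.bias)
    return pruned, Q_k

To approximately preserve the network's function, the next layer would absorb Q_k into its input-channel weights (left-multiplying along the input dimension), with fine-tuning recovering any remaining loss. This only approximates the letter's method, which transforms both the input and the output of each layer.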
Related Results
Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks
A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. The weight parameters need a lot of memory...
Noise Robust Projection Rule for Klein Hopfield Neural Networks
Multistate Hopfield models, such as complex-valued Hopfield neural networks (CHNNs), have been used as multistate neural associative memories. Quaternion-valued Hopfield neural net...
How Convolutional Neural Networks Diagnose Plant Disease
Deep learning with convolutional neural networks (CNNs) has achieved great success in the classification of various plant diseases. However, a limited number of studies have elucid...
Synaptic Dynamics in Analog VLSI
Synapses are crucial elements for computation and information transfer in both real and artificial neural systems. Recent experimental findings and theoretical models of pulse-base...
Convolutional Neural Networks for Image-Based High-Throughput Plant Phenotyping: A Review
Plant phenotyping has been recognized as a bottleneck for improving the efficiency of breeding programs, understanding plant-environment interactions, and managing agricultural sys...
A Neural Model of Olfactory Sensory Memory in the Honeybee's Antennal Lobe
We present a neural model for olfactory sensory memory in the honeybee's antennal lobe. To investigate the neural mechanisms underlying odor discrimination and memorization, we exp...
Detection of whale calls in noise: Performance comparison between a beluga whale, human listeners, and a neural network
This article examines the masking by anthropogenic noise of beluga whale calls. Results from human masking experiments and a software backpropagation neural network are compared to...
Unsupervised Learning
What use can the brain make of the massive flow of sensory information that occurs without any associated rewards or punishments? This question is reviewed in the light of connecti...