Biologically Realistic Computational Primitives of Neocortex Implemented on Neuromorphic Hardware Improve Vision Transformer Performance
Understanding the computational principles of the brain and replicating them on neuromorphic hardware and modern deep learning architectures is crucial for advancing neuro-inspired AI (NeuroAI). Here, we develop an experimentally constrained biophysical network model of neocortical circuit motifs, focusing on layers 2-3 of the primary visual cortex (V1). We investigate the role of four major cortical interneuron classes in a competitive-cooperative computational primitive and validate that these circuit motifs implement soft winner-take-all (sWTA) computation for gain modulation, signal restoration, and context-dependent multistability. Using a novel parameter mapping technique, we configured IBM’s TrueNorth (TN) chip to implement sWTA computations, mirroring biological neural dynamics. Retrospectively, we observed a strong correspondence between the biophysical model and the TN hardware parameters, particularly in the roles of four key inhibitory neuron classes: Parvalbumin (feedforward inhibition), Somatostatin (feedback inhibition), VIP (disinhibition), and LAMP5 (gain normalization). Moreover, sparse coupling of this sWTA motif also implemented a two-state neural state machine on the TN chip, replicating working memory dynamics essential for cognitive tasks. Additionally, integrating the sWTA computation as a preprocessing layer in the Vision Transformer (ViT) enhanced its performance on the MNIST digit classification task, demonstrating improved generalization to previously unseen data and suggesting a mechanism akin to zero-shot learning. Our approach provides a framework for translating brain-inspired computations to neuromorphic hardware, with potential applications on platforms like Intel’s Loihi2 and IBM’s Northpole. By integrating biophysically accurate models with neuromorphic hardware and advanced machine learning techniques, we offer a comprehensive roadmap for embedding neural computation into NeuroAI systems.
Cold Spring Harbor Laboratory
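The abstract describes the sWTA primitive as a competitive-cooperative motif built from recurrent excitation and feedback inhibition but does not spell out its equations. The sketch below is a minimal rate-based illustration of that kind of circuit, not the paper's biophysical model or its TrueNorth mapping: a pool of excitatory units with self-excitation `alpha` shares a single feedback-inhibitory unit with gain `beta`. The function name `swta` and all parameter values are assumptions chosen for illustration.

```python
import numpy as np

def relu(v):
    return np.maximum(v, 0.0)

def swta(inputs, alpha=1.5, beta=1.0, dt=0.1, tau=1.0, steps=300):
    """Rate-based soft winner-take-all (illustrative parameters).

    Each excitatory unit excites itself (alpha) and drives a shared
    inhibitory unit whose activity feeds back onto all units (beta).
    Differences between inputs are amplified while weaker channels are
    rectified toward zero: gain modulation plus signal restoration.
    """
    x = np.zeros_like(inputs, dtype=float)  # excitatory firing rates
    y = 0.0                                 # shared inhibitory rate
    for _ in range(steps):
        x = x + (dt / tau) * (-x + relu(inputs + alpha * x - beta * y))
        y = y + (dt / tau) * (-y + relu(x.sum()))
    return x

# The strongest input is amplified (here roughly 2x at steady state,
# since the winner settles near I / (1 - alpha + beta)); the rest are suppressed.
print(swta(np.array([1.0, 0.8, 0.2])))
```

With these gains the winner's steady-state rate is its input divided by (1 - alpha + beta) = 0.5, i.e. a gain of 2, while the weaker channels are driven to zero by the shared inhibition.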
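The abstract reports that adding the sWTA computation as a preprocessing layer improves a Vision Transformer on MNIST, but the integration details are not given here. The sketch below shows one plausible arrangement in PyTorch: a hypothetical `SoftWTA` module that stands in for the steady state of the competition with a sharpened softmax, applied to flattened image patches before a small transformer encoder (`TinyViT`). Module names, hyperparameters, and the placement of the sWTA stage are assumptions for illustration, not the paper's implementation.

```python
import torch
import torch.nn as nn

class SoftWTA(nn.Module):
    """Illustrative sWTA preprocessing: within each patch, features compete via
    a sharpened softmax, a steady-state stand-in for the recurrent dynamics."""
    def __init__(self, temperature=0.2):
        super().__init__()
        self.temperature = temperature

    def forward(self, patches):                       # (batch, n_patches, patch_dim)
        weights = torch.softmax(patches / self.temperature, dim=-1)
        return patches * weights                      # emphasize the strongest features

class TinyViT(nn.Module):
    """Minimal ViT-style MNIST classifier with the sWTA stage applied to raw
    patches before embedding (hypothetical placement)."""
    def __init__(self, patch=7, dim=64, depth=2, heads=4, n_classes=10):
        super().__init__()
        self.patch = patch
        n_patches = (28 // patch) ** 2
        self.swta = SoftWTA()
        self.embed = nn.Linear(patch * patch, dim)
        self.pos = nn.Parameter(torch.zeros(1, n_patches, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, n_classes)

    def forward(self, x):                             # (batch, 1, 28, 28)
        p = self.patch
        x = x.unfold(2, p, p).unfold(3, p, p)         # non-overlapping patches
        x = x.contiguous().view(x.size(0), -1, p * p)
        tokens = self.embed(self.swta(x)) + self.pos  # sWTA before the encoder
        return self.head(self.encoder(tokens).mean(dim=1))

model = TinyViT()
logits = model(torch.rand(8, 1, 28, 28))              # (8, 10)
```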