
On the role of network dynamics for information processing in artificial and biological neural networks

University Library J. C. Senckenberg
Description:
Understanding how interactions in complex systems give rise to collective behaviours has long been of interest to researchers across a wide range of fields. Despite many years of study, however, many research questions remain open. Recurrent neural networks (RNNs) are one example of such complex systems: our knowledge of them is incomplete, and their interpretability is therefore limited. Understanding dynamic information processing in RNNs is of particular interest, as it has direct applications in many fields, from biology to computer science. In this work, we study the information processing capabilities of such neural networks using a dynamical systems approach.
First, we review recent progress in the field of deep learning, particularly in the design of artificial neural networks (ANNs) such as feedforward neural networks (FNNs) and recurrent neural networks. Next, using dynamical systems analysis, we show how Lyapunov exponents can be utilized to characterize fading memory in recurrent neural networks and analyze the impact that memory properties have on learning dynamics. Our results align well with the recent literature, demonstrating that a dynamical systems approach can be successfully used to study properties of RNN dynamics and learning dynamics such as stability and convergence.
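The description does not spell out the estimation procedure, but the standard way to measure the largest Lyapunov exponent of an RNN is to track how quickly an infinitesimal perturbation of the hidden state grows or shrinks along a trajectory. The following is a minimal sketch of that idea for a vanilla tanh RNN; the network, its parameters (W_h, W_x, gain g), and the input statistics are illustrative assumptions, not the models used in the thesis.

```python
import numpy as np

# Minimal sketch: estimate the largest Lyapunov exponent of a vanilla
# tanh RNN by tracking the growth rate of a tiny state perturbation
# (Benettin-style renormalization). All parameters are illustrative.

rng = np.random.default_rng(0)
n_hidden, n_input, T = 64, 8, 2000
eps = 1e-8                                # perturbation magnitude

g = 0.9                                   # gain; g < 1 typically gives fading memory
W_h = g * rng.standard_normal((n_hidden, n_hidden)) / np.sqrt(n_hidden)
W_x = rng.standard_normal((n_hidden, n_input)) / np.sqrt(n_input)

def step(h, x):
    return np.tanh(W_h @ h + W_x @ x)

h = np.zeros(n_hidden)
d = rng.standard_normal(n_hidden)
d *= eps / np.linalg.norm(d)              # tiny initial perturbation
log_growth = 0.0

for t in range(T):
    x = rng.standard_normal(n_input)      # random input drive
    h_pert = step(h + d, x)
    h = step(h, x)
    d = h_pert - h                        # how the perturbation evolved
    norm = np.linalg.norm(d)
    log_growth += np.log(norm / eps)
    d *= eps / norm                       # renormalize to avoid over/underflow

lyap = log_growth / T                     # largest Lyapunov exponent (nats/step)
print(f"largest Lyapunov exponent ~ {lyap:.3f}")  # negative => fading memory
```

A negative exponent indicates fading memory (perturbations, and hence traces of past inputs, decay exponentially), while an exponent near zero places the network close to criticality, the regime the thesis links to a trade-off between performance and learning speed.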
Subsequently, we focus on the inductive biases of RNNs, in particular residual connections, a type of inductive bias often used in deep learning. We introduce a new class of RNNs, the weakly coupled residual networks (WCRNNs), and demonstrate how these networks can be equipped with different fading memory characteristics through the configuration of their residual connections. We investigate the dynamics and performance of these networks experimentally on benchmark datasets and theoretically via dynamical systems analysis. We confirm the theoretically predicted trade-off between performance and learning speed at criticality, and we explore which of heterogeneous, rotational, and non-linear residual connections yield the best practical expressivity. Our results uncover the essential role that inductive biases play in networks' practical expressivity, in agreement with recent developments in the design of recurrent neural networks.
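The description does not give the WCRNN equations, so the following is only a plausible sketch of a weakly coupled residual update: a residual branch carries the state forward while a weak, eps-scaled nonlinear term couples the units. The residual shown here combines two of the types named above, heterogeneity (per-pair rotation angles) and rotation (block-diagonal 2x2 rotations); all parameter names and values are assumptions.

```python
import numpy as np

# Illustrative sketch of a weakly coupled residual recurrent update.
# The exact WCRNN parameterization is not given in the abstract; here a
# rotational, heterogeneous residual branch R carries the state forward
# while a weak (eps-scaled) nonlinear recurrent term couples the units.

rng = np.random.default_rng(1)
n, eps = 32, 0.05                          # eps: weak coupling strength (assumption)

theta = rng.uniform(0, 0.1, n // 2)        # heterogeneous per-pair rotation angles
R = np.zeros((n, n))                       # block-diagonal rotational residual
for i, th in enumerate(theta):
    c, s = np.cos(th), np.sin(th)
    R[2*i:2*i+2, 2*i:2*i+2] = [[c, -s], [s, c]]

W = rng.standard_normal((n, n)) / np.sqrt(n)

def wcrnn_step(h, x):
    # residual (memory) path + weakly coupled nonlinear path
    return R @ h + eps * np.tanh(W @ h + x)

h = np.zeros(n)
for t in range(100):
    h = wcrnn_step(h, rng.standard_normal(n) * 0.1)
print(np.linalg.norm(h))
```

Scaling the rotation blocks by a factor slightly below one would give fading memory; leaving them orthogonal, as here, keeps the residual path at the edge of stability, i.e. near criticality.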
Next, we turn to the role of complex neural dynamics in information processing in the brain and review existing biophysical and mechanistic models that are commonly used to describe such dynamics on different scales. We also discuss how phenomenological models can be utilized to study the feasibility of coding strategies, such as strategies based on multistable attractors and metastable states. We note that despite their complexity, many detailed mechanistic models of neural dynamics have not been shown to perform meaningful information processing, and we present arguments in favor of studying computationally powerful phenomenological models of neural dynamics, thereby motivating the study in the next chapter.
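As a concrete illustration of the multistable-attractor coding strategy mentioned above (a textbook phenomenological example, not a model from the thesis), a small Hopfield network stores patterns as stable fixed points, so a corrupted cue relaxes onto the nearest stored memory.

```python
import numpy as np

# Toy phenomenological model of multistable attractor coding: a Hopfield
# network whose stored patterns are stable fixed points, so a noisy cue
# relaxes onto the nearest memory. Purely illustrative.

rng = np.random.default_rng(2)
n = 100
patterns = rng.choice([-1, 1], size=(2, n))           # two stored memories
W = (patterns.T @ patterns) / n                        # Hebbian weights
np.fill_diagonal(W, 0)

state = patterns[0].copy()
flip = rng.choice(n, size=20, replace=False)           # corrupt 20% of the cue
state[flip] *= -1

for _ in range(10):                                    # synchronous updates
    state = np.sign(W @ state)
    state[state == 0] = 1

# overlap near 1.0 means the network typically recalled pattern 0
print("overlap with pattern 0:", (state @ patterns[0]) / n)
```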
In the last chapter, we follow the proposed approach and investigate the dynamics of cortical circuits by incorporating general principles observed in experimental studies into computationally powerful recurrent networks. Motivated by recent advances in deep learning, we introduce an RNN model composed of coupled harmonic oscillators (HORN) that captures characteristic properties of neural dynamics. Our results support the idea that characteristic properties of neural dynamics such as oscillations, synchronization, resonance, and heterogeneity play a functional role in forming the complex patterns crucial for information processing in the brain. Our experiments with the HORN model also suggest that the brain can potentially utilize metastable dynamics for efficient neural coding and associative learning. The results obtained with the biologically inspired HORN model are in good agreement with our experiments on WCRNNs, showing the universal advantage of informed inductive biases.
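The description names the ingredients of the HORN model but not its equations; the sketch below is one plausible reading, assumed for illustration: each unit is a damped harmonic oscillator with heterogeneous frequency and damping, coupled to the others through a nonlinearity and driven by input, integrated with a simple semi-implicit Euler scheme.

```python
import numpy as np

# Minimal sketch of a HORN-style network: damped harmonic oscillator
# units with heterogeneous natural frequencies and damping, nonlinearly
# coupled and driven by input. Discretization and parameters are
# assumptions for illustration, not the thesis's actual formulation.

rng = np.random.default_rng(3)
n, dt, T = 32, 0.1, 500

omega = rng.uniform(0.5, 2.0, n)         # heterogeneous natural frequencies
gamma = rng.uniform(0.1, 0.5, n)         # heterogeneous damping
W = rng.standard_normal((n, n)) / np.sqrt(n)

x = np.zeros(n)                          # oscillator positions
v = np.zeros(n)                          # oscillator velocities

for t in range(T):
    u = np.sin(0.3 * t) * np.ones(n)     # simple periodic drive
    acc = -omega**2 * x - gamma * v + np.tanh(W @ x + u)
    v = v + dt * acc                     # semi-implicit Euler: update velocity,
    x = x + dt * v                       # then position with the new velocity
print(x[:5])
```

With heterogeneous frequencies, units resonate with different input timescales, which gives an intuition for how oscillations, resonance, and heterogeneity can act as informed inductive biases in such a network.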
We believe that our findings are a valuable contribution to the long-standing debate on the functional role of these properties of neural dynamics. Overall, our approach based on dynamical systems analysis offers valuable insights into information processing in both artificial and biological recurrent neural networks.

Related Results

Cometary Physics Laboratory: spectrophotometric experiments
Fuzzy Chaotic Neural Networks
An understanding of the human brain’s local function has improved in recent years. But the cognition of the human brain’s working process as a whole is still obscure. Both fuzzy logic ...
PARAMETRIC STUDY OF WARREN STEEL TRUSS BRIDGE USING ARTIFICIAL NEURAL NETWORKS
Abstract: Steel truss bridges are a popular type amongst several other standard bridges in Indonesia due to their lightweight yet robust and strong structure. In this study A...
The Geography of Cyberspace
The Virtual and the Physical. The structure of virtual space is a product of the Internet’s geography and technology. Debates around the nature of the virtual — culture, s...
Relevance of network topology for the dynamics of biological neuronal networks
Complex random networks provide a powerful mathematical framework to study high-dimensional physical and biological systems. Several features of network structure (e.g. degree corr...
DEVELOPMENT OF AN ONTOLOGICAL MODEL OF DEEP LEARNING NEURAL NETWORKS
This research paper examines the challenges and prospects associated with the integration of artificial neural networks and knowledge bases. The focus is on leveraging ...
MASKS: A Multi-Artificial Neural Networks System's verification approach
Artificial neural networks are one of the most widely applied approaches for classification problems. However, developing an errorless artificial neural network is in pr...
Integrating quantum neural networks with machine learning algorithms for optimizing healthcare diagnostics and treatment outcomes
The rapid advancements in artificial intelligence (AI) and quantum computing have catalyzed an unprecedented shift in the methodologies utilized for healthcare diagnostics and trea...
