Error Bounds for a Matrix-Vector Product Approximation with Deep ReLU Neural Networks

Abstract—Inspired by the depth and breadth of developments in the theory of deep learning, we pose two fundamental questions: can we accurately approximate an arbitrary matrix-vector product using deep rectified linear unit (ReLU) feedforward neural networks (FNNs)? If so, can we bound the resulting approximation error? To answer these questions, we derive error bounds in Lebesgue and Sobolev norms for the approximation of a matrix-vector product with deep ReLU FNNs. Since a matrix-vector product models numerous problems in wireless communications and signal processing; network science and graph signal processing; and network neuroscience and brain physics, we discuss various applications motivated by an accurate matrix-vector product approximation with deep ReLU FNNs. To this end, the derived error bounds offer theoretical insight and guarantees for the development of algorithms based on deep ReLU FNNs.
Institute of Electrical and Electronics Engineers (IEEE)
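As a complement to the abstract, here is a minimal sketch of why a matrix-vector product is realizable by a ReLU FNN at all: the standard identity t = ReLU(t) − ReLU(−t), applied coordinate-wise, lets a two-layer ReLU network compute y = Ax exactly. This construction is a common device in the ReLU approximation literature, not the specific deep, constrained architecture whose error the paper bounds; `relu_fnn_matvec` and the weight layout below are illustrative assumptions.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def relu_fnn_matvec(A, x):
    """Realize y = A @ x with a two-layer ReLU FNN via the
    coordinate-wise identity t = relu(t) - relu(-t)."""
    n = A.shape[1]
    W1 = np.vstack([np.eye(n), -np.eye(n)])  # (2n, n) first-layer weights
    W2 = np.hstack([A, -A])                  # (m, 2n) linear output layer
    # Hidden layer splits x into positive and negative parts;
    # the output layer recombines them as A*relu(x) - A*relu(-x) = A*x.
    return W2 @ relu(W1 @ x)

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
x = rng.standard_normal(3)
err = np.linalg.norm(relu_fnn_matvec(A, x) - A @ x)
print(err)  # numerically zero, up to floating-point rounding
```

Under width or depth constraints (the setting the paper's bounds address), such an exact construction is generally unavailable, which is where approximation-error bounds in Lebesgue and Sobolev norms become relevant.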

Related Results

Error Bounds for a Matrix-Vector Product Approximation with Deep ReLU Neural Networks
Abstract—Inspired by the depth and breadth of developments on the theory of deep learning, we pose these fundamental questions: can we accurately approximate an arbitrary matrix-ve...
Learning Theory and Approximation
The workshop Learning Theory and Approximation, organised by Kurt Jetter (Stuttgart-Hohenheim), Steve Smale (Berkeley) and Ding-Xuan Zhou (...
Deep convolutional neural network and IoT technology for healthcare
Background: Deep learning is an AI technology that trains computers to analyze data in a way similar to the human brain. Deep learning algorithms can find ...
Fuzzy Chaotic Neural Networks
An understanding of the human brain’s local function has improved in recent years, but the cognition of the human brain’s working process as a whole is still obscure. Both fuzzy logic ...
On the role of network dynamics for information processing in artificial and biological neural networks
Understanding how interactions in complex systems give rise to various collective behaviours has been of interest for researchers across a wide range of fields. However, despite ma...
Selection of Injectable Drug Product Composition using Machine Learning Models (Preprint)
BACKGROUND As of July 2020, a Web of Science search of “machine learning (ML)” nested within the search of “pharmacokinetics or pharmacodynamics” yielded over 100...
Characterizing Parameter Equivalence and Approximation in 1D FEM and ReLU NNs
Abstract This paper introduces a novel methodology for characterizing parameter equivalence between Rectified Linear Unit (ReLU) Neural Networks (NNs) and 1D Finite Element...
