T-HyperGNNs: Hypergraph Neural Networks Via Tensor Representations
<p>Hypergraph neural networks (HyperGNNs) are a family of deep neural networks designed to perform inference on hypergraphs. HyperGNNs follow either a spectral or a spatial approach, in which a convolution or message-passing operation is conducted based on a hypergraph algebraic descriptor. While many HyperGNNs have been proposed and achieved state-of-the-art performance on broad applications, there have been limited attempts at exploring high-dimensional hypergraph descriptors (tensors) and joint node interactions carried by hyperedges. In this paper, we depart from hypergraph matrix representations and present a new tensor-HyperGNN framework (T-HyperGNN) with cross-node interactions. The T-HyperGNN framework consists of T-spectral convolution, T-spatial convolution, and T-message-passing HyperGNNs (T-MPHN). The T-spectral convolution HyperGNN is defined under the t-product algebra that closely connects to the spectral space. To improve computational efficiency for large hypergraphs, we localize the T-spectral convolution approach to formulate the T-spatial convolution and further devise a novel tensor message-passing algorithm for practical implementation by studying a compressed adjacency tensor representation. Compared to the state-of-the-art approaches, our T-HyperGNNs preserve intrinsic high-order network structures without any hypergraph reduction and model the joint effects of nodes through a cross-node interaction layer. These advantages of our T-HyperGNNs are demonstrated in a wide range of real-world hypergraph datasets.</p>
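The t-product algebra mentioned in the abstract is the standard tensor-tensor product of Kilmer and Martin: two third-order tensors are multiplied by taking an FFT along the third mode, multiplying the frontal faces as ordinary matrices, and transforming back. The following is a minimal NumPy sketch of that general definition, not the authors' T-HyperGNN implementation:

```python
import numpy as np

def t_product(A, B):
    """t-product of third-order tensors A (n1 x n2 x n3) and B (n2 x m x n3).

    Computed in the Fourier domain: FFT along the third mode,
    facewise matrix multiplication, then inverse FFT.
    """
    assert A.shape[1] == B.shape[0] and A.shape[2] == B.shape[2]
    A_hat = np.fft.fft(A, axis=2)
    B_hat = np.fft.fft(B, axis=2)
    # Multiply each frontal face A_hat[:, :, k] @ B_hat[:, :, k]
    C_hat = np.einsum('ijk,jlk->ilk', A_hat, B_hat)
    return np.real(np.fft.ifft(C_hat, axis=2))

# Sanity check: the identity tensor (identity matrix in the first
# frontal slice, zeros elsewhere) acts as a right identity.
A = np.random.rand(3, 4, 5)
I = np.zeros((4, 4, 5))
I[:, :, 0] = np.eye(4)
assert np.allclose(t_product(A, I), A)
```

The FFT route is equivalent to multiplying A's block-circulant unfolding against B's block-column unfolding, which is why the t-product connects so directly to a spectral (Fourier) space.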
Institute of Electrical and Electronics Engineers (IEEE)
Related Results
Theoretical Foundations and Practical Applications in Signal Processing and Machine Learning
Tensor decomposition has emerged as a powerful mathematical framework for analyzing multi-dimensional data, extending classical matrix decomposition techniques to higher-order repr...
Enhanced inherent strain modelling for powder-based metal additive manufacturing
Metal additive manufacturing (MAM), particularly powder bed fusion using a laser beam (PBF-LB), has transformed manufacturing by enabling the production of intricate and ...
Meta-Representations as Representations of Processes
In this study, we explore how the notion of meta-representations in Higher-Order Theories (HOT) of consciousness can be implemented in computational models. HOT suggests that consc...
Gravitational Waves from Alena Tensor
Alena Tensor is a recently discovered class of energy-momentum tensors that proposes a general equivalence of the curved path and the geodesic for the analyzed spacetimes which all...
Fuzzy Chaotic Neural Networks
An understanding of the human brain’s local function has improved in recent years. But the cognition of human brain’s working process as a whole is still obscure. Both fuzzy logic ...
On the role of network dynamics for information processing in artificial and biological neural networks
Understanding how interactions in complex systems give rise to various collective behaviours has been of interest for researchers across a wide range of fields. However, despite ma...
Hypergraph partitioning using tensor eigenvalue decomposition
Hypergraphs have gained increasing attention in the machine learning community lately due to their superiority over graphs in capturing super-dyadic interactions among entities. In t...

