Addressing the Limitations of Graph Neural Networks on Node-level Tasks
Description:
As a generic data structure, a graph can model complex relations among objects in many real-world problems.
Integrated with deep learning and graph signal processing, Graph Neural Networks (GNNs) have achieved significant progress in solving large, complex, graph-structured problems in the recent decade.
GNNs extend basic Neural Networks (NNs) by incorporating graph structure grounded in the relational inductive bias, and are commonly believed to outperform NNs on real-world tasks.
Despite their efficacy, the development of deep and shallow GNNs faces two main challenges:
• Limited expressive power of deep GNNs: Since graph convolution can be viewed as a special form of Laplacian smoothing, stacking many GNN layers in the manner of deep NNs leads to an over-smoothing issue, where the representations of distant nodes become less identifiable and hard to discriminate.
• Performance degradation of shallow GNNs on heterophilic graphs: When the homophily principle is absent and nodes from different classes are more likely to be connected, the representations of nodes from distinct classes are erroneously blended, leaving the nodes indistinguishable.
In this dissertation, we delve into these two obstacles, analyze them thoroughly, and propose methods to address them efficiently.
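The over-smoothing effect described above can be illustrated with a small numerical sketch. The toy graph, random features, and the `spread` measure below are illustrative assumptions, not taken from the dissertation; the propagation matrix is the standard GCN-style symmetrically normalized adjacency with self-loops, D^{-1/2}(A + I)D^{-1/2}, whose repeated application acts as Laplacian smoothing:

```python
import numpy as np

# Hypothetical demonstration of over-smoothing: repeatedly applying the
# GCN propagation matrix P = D^{-1/2} (A + I) D^{-1/2} drives node
# representations toward one another, making nodes hard to discriminate.

rng = np.random.default_rng(0)

# Toy undirected graph: a 6-node path
n = 6
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0

A_hat = A + np.eye(n)                 # add self-loops
d = A_hat.sum(axis=1)                 # degrees of A + I
P = A_hat / np.sqrt(np.outer(d, d))   # symmetric normalization

X = rng.standard_normal((n, 4))       # random 4-dim node features

def spread(X):
    """Mean pairwise distance between node representations."""
    return np.mean([np.linalg.norm(X[i] - X[j])
                    for i in range(n) for j in range(i + 1, n)])

for k in [0, 2, 8, 32]:
    Xk = np.linalg.matrix_power(P, k) @ X if k else X
    print(f"{k:2d} propagation steps: node spread = {spread(Xk):.4f}")
```

As the number of propagation steps grows, all components of the features orthogonal to the dominant eigenvector of P decay, so the printed spread shrinks: in the limit, each node's representation differs only by a degree-dependent scaling, which is the indistinguishability the abstract refers to.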