Understanding Heterophily for Graph Neural Networks

DOI: 10.48550/arxiv.2401.09125 Publication Date: 2024-01-01
ABSTRACT
Graphs with heterophily have been regarded as challenging scenarios for Graph Neural Networks (GNNs), where nodes are connected with dissimilar neighbors through various patterns. In this paper, we present theoretical understandings of the impacts of different heterophily patterns on GNNs by incorporating graph convolution (GC) operations into fully connected networks via the proposed Heterophilous Stochastic Block Models (HSBM), a general random graph model that can accommodate diverse heterophily patterns. Firstly, we show that by applying a GC operation, the separability gains are determined by two factors, i.e., the Euclidean distance of the neighborhood distributions and $\sqrt{\mathbb{E}\left[\operatorname{deg}\right]}$, where $\mathbb{E}\left[\operatorname{deg}\right]$ is the averaged node degree. It reveals that the impact of heterophily on classification needs to be evaluated alongside the averaged node degree. Secondly, we show that topological noise has a detrimental impact on separability, which is equivalent to degrading $\mathbb{E}\left[\operatorname{deg}\right]$. Finally, when applying multiple GC operations, the separability gains are determined by the normalized distance of the $l$-powered neighborhood distributions. It indicates that the nodes still possess separability as $l$ goes to infinity in a wide range of regimes. Extensive experiments on both synthetic and real-world data verify the effectiveness of our theory.
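
The role of the neighborhood-distribution distance and of $\sqrt{\mathbb{E}\left[\operatorname{deg}\right]}$ can be illustrated with a small simulation. The sketch below is not the authors' code: it assumes a two-block HSBM with hypothetical parameters (p_intra, p_inter), a simple mean-aggregation GC step, and a crude separability proxy, and compares separability before and after one GC operation.

# Minimal sketch (assumed setup, not the paper's implementation): a two-block
# heterophilous SBM where inter-class edges dominate, plus one GC step.
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 8                      # nodes per class, feature dimension
p_intra, p_inter = 0.02, 0.08      # heterophilous: inter-class edges dominate

# Node features: two Gaussian classes with a small mean gap.
mu = np.zeros(d); mu[0] = 1.0
X = np.vstack([rng.normal(+mu, 1.0, (n, d)),
               rng.normal(-mu, 1.0, (n, d))])
y = np.array([0] * n + [1] * n)

# Sample the HSBM adjacency matrix (symmetric, no self-loops).
same = (y[:, None] == y[None, :])
probs = np.where(same, p_intra, p_inter)
A = (rng.random((2 * n, 2 * n)) < probs).astype(float)
A = np.triu(A, 1); A = A + A.T

def gc_step(A, X):
    """One mean-aggregation graph convolution: each node averages its neighbors."""
    deg = A.sum(1, keepdims=True)
    return (A @ X) / np.maximum(deg, 1.0)

def separability(X, y):
    """Crude proxy: distance between class means over the overall feature std."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    return np.linalg.norm(m0 - m1) / X.std()

print("separability before GC:", separability(X, y))
print("separability after  GC:", separability(gc_step(A, X), y))

Even though most edges connect dissimilar nodes, the separability proxy typically improves after one GC step in this toy setting, because the two classes have different neighborhood distributions and averaging over roughly $\mathbb{E}\left[\operatorname{deg}\right]$ neighbors shrinks the feature noise, consistent with the $\sqrt{\mathbb{E}\left[\operatorname{deg}\right]}$ factor described in the abstract.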