Improving Graph Attention Networks with Large Margin-based Constraints
Smoothing
Margin (machine learning)
Benchmark (computing)
Feature (machine learning)
Feature Learning
Representation Learning
DOI: 10.48550/arXiv.1910.11945
Publication Date: 2019-10
AUTHORS (4)
ABSTRACT
Graph Attention Networks (GATs) are the state-of-the-art neural architecture for representation learning with graphs. GATs learn attention functions that assign weights to nodes so that different nodes have different influences in the feature aggregation steps. In practice, however, the induced attention functions are prone to over-fitting due to the increasing number of parameters and the lack of direct supervision on the attention weights. GATs also suffer from over-smoothing at the decision boundary of nodes. Here we propose a framework to address their weaknesses via margin-based constraints on attention during training. We first theoretically demonstrate the over-smoothing behavior of GATs and then develop an approach using margin-based constraints on the attention weights according to the class boundary and feature aggregation pattern. Furthermore, to alleviate the over-fitting problem, we propose additional constraints based on the graph structure. Extensive experiments and ablation studies on common benchmark datasets demonstrate the effectiveness of our method, which leads to significant improvements over previous methods on all datasets.
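The abstract does not give the exact form of the margin-based constraints, so the following is only a minimal sketch of one plausible instantiation in PyTorch: a hinge penalty that pushes each node's attention mass toward same-class neighbors by at least a fixed margin. The helper name margin_attention_penalty, the same-class/different-class split, the hinge form, and the default margin are all illustrative assumptions, not the paper's method.

```python
import torch
import torch.nn.functional as F

def margin_attention_penalty(attention, edge_index, labels, margin=0.1):
    """Hinge-style margin penalty on GAT attention weights (illustrative sketch).

    Encourages each node to attend more strongly to same-class neighbors
    than to different-class neighbors by at least `margin`. The exact
    constraint in the paper is not specified in the abstract; this is an
    assumed formulation for illustration.

    attention:  [num_edges] attention weight alpha_ij per directed edge
    edge_index: [2, num_edges] (source, target) node index pairs
    labels:     [num_nodes] class labels (e.g., from the training set)
    """
    src, dst = edge_index
    same_class = (labels[src] == labels[dst]).float()

    # Total attention mass each target node receives from same-class
    # vs. different-class neighbors.
    num_nodes = labels.size(0)
    same_mass = torch.zeros(num_nodes).index_add_(0, dst, attention * same_class)
    diff_mass = torch.zeros(num_nodes).index_add_(0, dst, attention * (1.0 - same_class))

    # Hinge loss: penalize nodes whose same-class attention does not
    # exceed their different-class attention by the margin.
    return F.relu(margin - (same_mass - diff_mass)).mean()

# Example usage (toy numbers): 3 nodes, 4 directed edges.
edge_index = torch.tensor([[0, 1, 2, 1], [1, 0, 1, 2]])
attention = torch.tensor([0.7, 0.6, 0.3, 0.4])
labels = torch.tensor([0, 0, 1])
penalty = margin_attention_penalty(attention, edge_index, labels)
```

In a sketch like this, the penalty would be added to the usual supervised objective, e.g. total_loss = cross_entropy + lam * penalty, with the weight lam (another assumed hyperparameter) controlling how strongly the margin constraint shapes the learned attention.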