Measuring Task Similarity and Its Implication in Fine-Tuning Graph Neural Networks
DOI:
10.1609/aaai.v38i11.29156
Publication Date:
2024-03-25T11:01:39Z
AUTHORS (7)
ABSTRACT
The paradigm of pre-training and fine-tuning graph neural networks has attracted wide research attention. In previous studies, pre-trained models are viewed as universally versatile and applied to a diverse range of downstream tasks. In many situations, however, this practice results in limited or even negative transfer. This paper, for the first time, emphasizes the specific application scope of pre-trained models: not all downstream tasks can effectively benefit from a pre-trained model. In light of this, we introduce a measure of task consistency to quantify the similarity between a pre-trained model and a downstream task, which assesses the extent to which the task can benefit from the model. Moreover, a novel fine-tuning strategy, Bridge-Tune, is proposed to further diminish the impact of the task difference. The key innovation of Bridge-Tune is an intermediate step that bridges the pre-trained model and the downstream task; it takes the task differences into account and refines the pre-trained model. The superiority of the presented strategy is validated via numerous experiments with different pre-trained models and downstream tasks.
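The abstract describes two ingredients: a task-consistency score that predicts whether a downstream task will benefit from a given pre-trained model, and an intermediate "bridging" step inserted before standard fine-tuning. The paper's actual definitions are not given on this page, so the sketch below is purely illustrative: it assumes task consistency can be approximated by cosine similarity between mean task representations, and models the bridge as a sequence of interpolated intermediate targets. Both `task_consistency` and `bridge_targets` are hypothetical names, not the authors' implementation.

```python
import numpy as np

def task_consistency(pretrain_emb: np.ndarray, downstream_emb: np.ndarray) -> float:
    """Illustrative consistency score: cosine similarity between the mean
    representation of the pre-training task and that of the downstream task.
    (Assumption for this sketch; the paper's measure may differ.)"""
    a = pretrain_emb.mean(axis=0)
    b = downstream_emb.mean(axis=0)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def bridge_targets(pretrain_emb: np.ndarray, downstream_emb: np.ndarray,
                   n_steps: int = 3) -> list:
    """Illustrative 'bridge': intermediate representation targets that
    interpolate between the two tasks, which a staged fine-tuning loop
    could optimize toward one at a time before the final downstream step."""
    a = pretrain_emb.mean(axis=0)
    b = downstream_emb.mean(axis=0)
    # Exclude the endpoints (t=0 is the pre-trained task, t=1 the downstream task).
    return [a + t * (b - a) for t in np.linspace(0.0, 1.0, n_steps + 2)[1:-1]]
```

A low consistency score would then flag tasks where direct fine-tuning risks limited or negative transfer, which is exactly the regime where the intermediate bridging step is meant to help.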
CITATIONS (4)