Watch Your Step: Learning Node Embeddings via Graph Attention

Subjects: Social and Information Networks (cs.SI); Machine Learning (cs.LG); Machine Learning (stat.ML)
DOI: 10.48550/arxiv.1710.09599 Publication Date: 2017-01-01
ABSTRACT
Graph embedding methods represent nodes in a continuous vector space, preserving information from the graph (e.g. by sampling random walks). There are many hyper-parameters to these methods (such as random walk length) which have to be manually tuned for every graph. In this paper, we replace these hyper-parameters with trainable parameters that we automatically learn via backpropagation. In particular, we learn a novel attention model on the power series of the transition matrix, which guides the random walk to optimize an upstream objective. Unlike previous approaches to attention models, the method we propose utilizes attention parameters exclusively on the data (e.g. on the random walk), and they are not used by the model for inference. We experiment on link prediction tasks, as we aim to produce embeddings that best-preserve the graph structure, generalizing to unseen information. We improve state-of-the-art results on a comprehensive suite of real-world datasets including social, collaboration, and biological networks, where adding attention to random walks reduces error by 20% to 45% on the datasets we attempted. Further, our learned attention parameters differ for every graph, and the automatically-found values agree with the optimal choice of hyper-parameter when we manually tune existing methods.
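The following is a minimal sketch (not the authors' released code) of the core idea in the abstract: rather than fixing a random-walk context window by hand, learn softmax attention weights over the powers of the graph's transition matrix and train them, jointly with node embeddings, by gradient descent. The graph, embedding sizes, learning rates, and the simplified negative-log-graph-likelihood objective below are illustrative assumptions.

```python
# Sketch of attention over the power series of the transition matrix,
# trained jointly with node embeddings. Hyper-parameters here are assumed.
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph: two triangles joined by one edge.
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]
n = 6
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0
T = A / A.sum(axis=1, keepdims=True)           # row-stochastic transition matrix

C = 5                                          # highest power of T considered
d = 8                                          # embedding dimension
L = 0.05 * rng.standard_normal((n, d))         # node ("left") embeddings
R = 0.05 * rng.standard_normal((n, d))         # context ("right") embeddings
q_logits = np.zeros(C)                         # attention logits over T^1..T^C

T_powers = np.stack([np.linalg.matrix_power(T, k + 1) for k in range(C)])

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

lr = 0.5
for step in range(300):
    q = softmax(q_logits)
    E = np.tensordot(q, T_powers, axes=1)      # expected co-occurrence matrix
    S = L @ R.T                                # pairwise embedding scores
    P = 1.0 / (1.0 + np.exp(-S))               # sigmoid(scores)

    # Simplified objective: -E * log sigmoid(S) - (1 - A) * log sigmoid(-S),
    # i.e. pull together pairs in proportion to E, push apart non-edges.
    grad_S = -E * (1.0 - P) + (1.0 - A) * P
    grad_L = grad_S @ R
    grad_R = grad_S.T @ L
    L -= lr * grad_L / n
    R -= lr * grad_R / n

    # Backpropagate into the attention logits through the softmax.
    grad_E = -np.log(P + 1e-9)
    grad_q = np.array([(grad_E * T_powers[k]).sum() for k in range(C)])
    grad_logits = q * (grad_q - (q * grad_q).sum())
    q_logits -= 0.01 * grad_logits

print("learned attention over walk distances 1..%d:" % C,
      np.round(softmax(q_logits), 3))
```

After training, the printed attention vector plays the role of the hand-tuned context-window hyper-parameter in existing random-walk methods: it reports how much weight each walk distance receives for this particular graph.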