SCGG: A deep structure-conditioned graph generative model

Keywords: Generative model, Substructure
DOI: 10.1371/journal.pone.0277887
Publication date: 2022-11-21
ABSTRACT
Deep learning-based graph generation approaches have remarkable capacities for data modeling, allowing them to solve a wide range of real-world problems. Making these methods able to consider different conditions during the generation procedure further increases their effectiveness by empowering them to generate new graph samples that meet the desired criteria. This paper presents a conditional deep graph generation method called SCGG that considers a particular type of structural condition. Specifically, our proposed model takes an initial subgraph and autoregressively generates new nodes and their corresponding edges on top of the given conditioning substructure. The architecture of SCGG consists of a graph representation learning network and an autoregressive generative model, which is trained end-to-end. More precisely, the representation learning network is designed to compute continuous representations of each node in the graph, which are affected not only by the features of adjacent nodes, but also by those of farther nodes. This network is primarily responsible for providing the generation procedure with the structural condition, while the autoregressive generative model mainly maintains the generation history. Using this model, we can address graph completion, a rampant and inherently difficult problem of recovering the missing nodes and edges associated with partially observed graphs. The computational complexity of our method is shown to be linear in the number of graph nodes. Experimental results on both synthetic and real-world datasets demonstrate the superiority of our method compared to state-of-the-art baselines.
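The generation procedure described above can be illustrated with a minimal, framework-free sketch. This is not the authors' SCGG implementation: the learned representation and generative networks are replaced by a placeholder `edge_prob` scorer, and the autoregressive loop simply adds one node at a time, deciding each edge to the existing graph with one Bernoulli draw, so that the conditioning subgraph is preserved and the work per new node is linear in the current node count.

```python
import random

def autoregressive_complete(adj, n_target, edge_prob, seed=0):
    """Grow a graph autoregressively on top of a conditioning subgraph.

    adj:       observed subgraph as dict[node] -> set of neighbours
               (nodes are assumed to be labelled 0..k-1).
    n_target:  total number of nodes in the completed graph.
    edge_prob: placeholder scorer p(new node v connects to existing node u);
               in SCGG this role is played by the learned networks.
    """
    rng = random.Random(seed)
    adj = {u: set(vs) for u, vs in adj.items()}  # copy; keep the condition intact
    while len(adj) < n_target:
        new = len(adj)                # id of the next generated node
        adj[new] = set()
        for u in range(new):          # one edge decision per existing node
            if rng.random() < edge_prob(new, u, adj):
                adj[new].add(u)
                adj[u].add(new)       # keep the adjacency symmetric
    return adj

# Example: complete an observed 3-node path into a 6-node graph
# using a toy constant-probability scorer.
observed = {0: {1}, 1: {0, 2}, 2: {1}}
completed = autoregressive_complete(observed, 6, lambda v, u, g: 0.5)
```

Because each new node only scores edges against nodes generated so far, total work is quadratic in the final node count for this naive sketch; the linear complexity claimed for SCGG comes from its specific architecture, not from this loop.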