A Grammar-Based Structural CNN Decoder for Code Generation

DOI: 10.1609/aaai.v33i01.33017055 Publication Date: 2019-08-25
ABSTRACT
Code generation maps a program description to executable source code in a programming language. Existing approaches mainly rely on a recurrent neural network (RNN) as the decoder. However, we find that a program contains significantly more tokens than a natural language sentence, and thus it may be inappropriate for an RNN to capture such a long sequence. In this paper, we propose a grammar-based structural convolutional neural network (CNN) for code generation. Our model generates a program by predicting the grammar rules of the programming language; we design several CNN modules, including a tree-based convolution and a pre-order convolution, whose information is further aggregated by dedicated attentive pooling layers. Experimental results on the HearthStone benchmark dataset show that our code generator outperforms the previous state-of-the-art method by 5 percentage points; additional experiments on semantic parsing tasks demonstrate the robustness of our model. We also conduct an in-depth ablation test to better understand each component of our model.
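The abstract's key structural idea, convolving over the partially generated AST rather than a flat token sequence, can be illustrated with a minimal sketch. Everything below (the `AstNode` class, the `tree_conv` helper, the depth-2 window, the identity weights) is an illustrative assumption for exposition, not the paper's actual implementation:

```python
import math

class AstNode:
    """A node of a (partial) abstract syntax tree with a feature vector."""
    def __init__(self, feat, children=None):
        self.feat = feat                  # feature vector, a list of floats
        self.children = children or []

def tree_conv(node, w_self, w_child, bias):
    """Apply a depth-2 convolution window (node + its children) at every node.

    For each node: y = tanh(W_self @ x_node + W_child @ mean(children) + b),
    so every node's output feature reflects local tree structure.
    Returns a list with one convolved feature vector per node (pre-order).
    """
    dim = len(node.feat)
    if node.children:
        child_mean = [sum(c.feat[i] for c in node.children) / len(node.children)
                      for i in range(dim)]
    else:
        child_mean = [0.0] * dim          # leaves see a zero child summary
    out = []
    for row_s, row_c, b in zip(w_self, w_child, bias):
        s = sum(r * x for r, x in zip(row_s, node.feat))
        s += sum(r * x for r, x in zip(row_c, child_mean))
        out.append(math.tanh(s + b))
    # recurse so the whole tree is convolved, then flatten the results
    child_feats = [tree_conv(c, w_self, w_child, bias) for c in node.children]
    return [out] + [f for cf in child_feats for f in cf]

# Toy 2-D example: a root with two leaf children, identity weights.
leaf1 = AstNode([1.0, 0.0])
leaf2 = AstNode([0.0, 1.0])
root = AstNode([0.5, 0.5], [leaf1, leaf2])
identity = [[1.0, 0.0], [0.0, 1.0]]
feats = tree_conv(root, identity, identity, [0.0, 0.0])
print(len(feats))  # → 3, one convolved feature per node
```

In the paper's full model, features like these would then be aggregated by attentive pooling layers before predicting the next grammar rule; the sketch stops at the convolution step.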