Deep Semantics-Enhanced Neural Code Search
DOI:
10.3390/electronics13234704
Publication Date:
2024-11-28T12:28:01Z
AUTHORS (6)
ABSTRACT
Code search uses natural language queries to retrieve code snippets from a vast database, identifying those that are semantically similar to the query. This enables developers to reuse code and enhances software development efficiency. Most existing algorithms focus on capturing semantic and structural features by learning both text and graph structures. However, these approaches often struggle to capture the deeper semantics within code sources, leading to lower accuracy in search results. To address this issue, this paper proposes a novel deep semantics-enhanced neural code search algorithm called SENCS, which employs graph serialization and a two-stage attention mechanism. First, the program dependency graph is transformed into a unique serialized encoding, and a bidirectional long short-term memory (LSTM) model is used to learn the semantic information of the sequence and generate vectors rich in deep semantic features. Second, the two-stage attention mechanism enhances the embedded representations by assigning different weights to various features during the feature fusion phase and emphasizing significant sequences, resulting in richer semantic information. To validate the performance of the proposed algorithm, extensive experiments were conducted on two widely used datasets, CodeSearchNet and JavaNet. The experimental results show that SENCS improves the average metrics by 8.30% (MRR) and 17.85% (DCG) compared with the best baseline in the literature, with an improvement of 14.86% in the SR@1 metric. Experiments on open-source datasets demonstrate that SENCS achieves a better search effect than state-of-the-art models.
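
The abstract describes the core pipeline of SENCS: serialize the program dependency graph, encode the resulting token sequence with a bidirectional LSTM, and weight the learned features with attention before matching against the query. The following is a minimal sketch of that idea in PyTorch, not the authors' implementation; the layer sizes, the single attention-pooling step, the vocabulary, and the cosine-similarity matching are illustrative assumptions.

# Minimal sketch (not the authors' code): a BiLSTM encoder over a serialized
# program-dependency-graph token sequence, followed by attention pooling,
# in the spirit of the SENCS description above. Sizes and tokenization are
# illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SequenceEncoder(nn.Module):
    """Encodes a serialized PDG (or a natural-language query) into one vector."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # attention scores over time steps, used to weight the BiLSTM outputs
        self.attn = nn.Linear(2 * hidden_dim, 1)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embed(token_ids)                      # (batch, seq_len, embed_dim)
        h, _ = self.bilstm(x)                          # (batch, seq_len, 2*hidden_dim)
        weights = torch.softmax(self.attn(h), dim=1)   # (batch, seq_len, 1)
        return (weights * h).sum(dim=1)                # (batch, 2*hidden_dim)

def similarity(code_vec, query_vec):
    # Cosine similarity between code and query vectors, a common choice
    # in neural code search for ranking candidate snippets.
    return F.cosine_similarity(code_vec, query_vec, dim=-1)

# Toy usage with random token ids standing in for serialized PDG / query tokens.
encoder = SequenceEncoder(vocab_size=10_000)
code_ids = torch.randint(1, 10_000, (4, 50))
query_ids = torch.randint(1, 10_000, (4, 12))
print(similarity(encoder(code_ids), encoder(query_ids)).shape)  # torch.Size([4])

In a full two-stage design as described above, a second attention step would additionally reweight the different feature channels (for example, token and graph features) during fusion before the similarity is computed.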