Word Embedding Methods for Word Representation in Deep Learning for Natural Language Processing

Keywords: Word embedding
DOI: 10.24996/ijs.2022.63.3.37 · Published: 2022-04-05
ABSTRACT
Natural Language Processing (NLP) deals with analysing, understanding, and generating language the way humans do. One of the challenges of NLP is training computers to understand how humans learn and use a language: every conversation consists of several types of sentences with different contexts and linguistic structures. The meaning of a sentence depends on the actual meanings of its main words and their correct positions; the same word can serve as a noun in one sentence and as an adjective in another, depending on its position. In NLP, word embedding is a powerful method in which a model trained on a large collection of texts encodes general semantic and syntactic information about words, and choosing the right embedding produces more efficient results than others. Most papers use pretrained word vectors in deep learning for NLP; a major issue, however, is that a pretrained vector cannot cover all the words of a particular corpus. In this paper, a local word embedding formation process is proposed, and a comparison between word vectors for the Bengali language is shown. The implementation uses the Keras framework for Python. The analysis section of the paper shows that the proposed model produced 87.84% accuracy, better than fastText's 86.75%. Using this approach, researchers can easily build domain-specific word representations for Natural Language Processing.