BERT4Bitter: a bidirectional encoder representations from transformers (BERT)-based model for improving the prediction of bitter peptides
KEYWORDS
Benchmarking, Robustness, Identification, n-gram
DOI: 10.1093/bioinformatics/btab133
Publication Date: 2021-02-24T12:23:03Z
AUTHORS (5)
ABSTRACT
The identification of bitter peptides through experimental approaches is an expensive and time-consuming endeavor. Due to the huge number of newly available peptide sequences in the post-genomic era, the development of automated computational models for identifying novel bitter peptides is highly desirable. In this work, we present BERT4Bitter, a bidirectional encoder representations from transformers (BERT)-based model for predicting bitter peptides directly from their amino acid sequence without using any structural information. To the best of our knowledge, this is the first time a BERT-based model has been employed to identify bitter peptides. Compared with widely used machine learning models, BERT4Bitter achieved the best performance, with accuracies of 0.861 and 0.922 on cross-validation and independent tests, respectively. Furthermore, extensive empirical benchmarking experiments on the independent dataset demonstrated that BERT4Bitter clearly outperformed the existing method, with improvements of 8.0% and 16.0% in accuracy and Matthews correlation coefficient, respectively, highlighting the effectiveness and robustness of BERT4Bitter. We believe that the model proposed herein will be a useful tool for rapidly screening and identifying novel bitter peptides for drug and nutritional research. The user-friendly web server is freely accessible at http://pmlab.pythonanywhere.com/BERT4Bitter. Supplementary data are available at Bioinformatics online.
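The abstract only states the high-level idea: peptides are classified as bitter or non-bitter by a BERT-style transformer encoder operating on the raw amino acid sequence, with no structural features. The sketch below illustrates that sequence-only encoder-plus-classification-head pattern in PyTorch; the tokenization scheme, the class name PeptideEncoderClassifier, the helper encode, the special [PAD]/[CLS] tokens, and all hyperparameters are illustrative assumptions, not the authors' implementation (the trained model itself is served at the URL above).

# Minimal sketch (not the authors' code): a BERT-style transformer encoder that
# scores a peptide as bitter / non-bitter directly from its amino acid sequence.
# Vocabulary, model sizes, and token ids are illustrative assumptions.
import torch
import torch.nn as nn

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
PAD, CLS = 0, 1                                   # special token ids (assumed)
VOCAB = {aa: i + 2 for i, aa in enumerate(AMINO_ACIDS)}

def encode(peptide: str, max_len: int = 32) -> torch.Tensor:
    """Map a peptide string to fixed-length token ids: [CLS] + residues + padding."""
    ids = [CLS] + [VOCAB[aa] for aa in peptide.upper()]
    ids = ids[:max_len] + [PAD] * (max_len - len(ids))
    return torch.tensor(ids)

class PeptideEncoderClassifier(nn.Module):
    """Transformer encoder whose [CLS] representation feeds a binary head."""
    def __init__(self, vocab_size=22, d_model=64, nhead=4, num_layers=2, max_len=32):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model, padding_idx=PAD)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, ids):                        # ids: (batch, max_len)
        pos = torch.arange(ids.size(1), device=ids.device)
        x = self.tok(ids) + self.pos(pos)          # token + position embeddings
        x = self.encoder(x, src_key_padding_mask=(ids == PAD))
        return self.head(x[:, 0]).squeeze(-1)      # logit from the [CLS] position

if __name__ == "__main__":
    model = PeptideEncoderClassifier()
    batch = torch.stack([encode("GPFPIIV"), encode("ALPMHIR")])  # toy peptides
    print(torch.sigmoid(model(batch)))             # untrained, so scores are random

In practice such a model would be trained on a labeled bitter/non-bitter peptide dataset and evaluated by cross-validation plus an independent test set, as reported above; the sketch only shows how a sequence-only transformer classifier is wired together.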