Lam D. Chau

ORCID: 0000-0002-1215-087X
About
Research Areas
  • Biomedical Text Mining and Ontologies
  • Natural Language Processing Techniques
  • Topic Modeling
  • Galectins and Cancer Biology
  • Glycosylation and Glycoproteins Research
  • Carbohydrate Chemistry and Synthesis

Case Western Reserve University
2022-2025

Abstract: Mucin-type O-glycan core elongation is typically performed by the C1GALT1, B3GNT6, and ST6GalNAc-I/-II O-glycosyltransferases. These enzymes target the Tn antigen (GalNAc-O-Thr/Ser), dictating the fate of elongation and playing important roles in health and disease. Changes in transferase expression and glycan structure are commonly associated with diseases such as cancer, Tn syndrome, and ulcerative colitis. Despite their significance, their substrate specificities and biological roles remain elusive. Here, we examine...

10.1093/glycob/cwaf014 article EN Glycobiology 2025-03-10

Long Phan, Tai Dang, Hieu Tran, Trieu H. Trinh, Vy Lam D. Chau, Minh-Thang Luong. Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics. 2023.

10.18653/v1/2023.eacl-main.228 article EN cc-by 2023-01-01

Abstract: Biomedical data and benchmarks are highly valuable yet very limited in low-resource languages other than English, such as Vietnamese. In this paper, we make use of a state-of-the-art English-Vietnamese translation model to translate and produce both pretrained and supervised data in the biomedical domains. Thanks to such large-scale translation, we introduce ViPubmedT5, an Encoder-Decoder Transformer trained on 20 million translated abstracts from the high-quality public PubMed corpus. ViPubMedT5 demonstrates...

10.1101/2022.10.11.511776 preprint EN cc-by-nc-nd bioRxiv (Cold Spring Harbor Laboratory) 2022-10-14
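
The translate-then-pretrain pipeline described in this abstract can be sketched in a few lines with the Hugging Face transformers library. This is only an illustrative sketch under assumptions, not the authors' code: the translation checkpoint name (VietAI/envit5-translation), the "en: " source-language prefix, and the base T5 checkpoint standing in for ViPubmedT5 pretraining are all assumptions made for the example.

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Stage 1: translate English PubMed abstracts into Vietnamese with an
# English-Vietnamese seq2seq model (checkpoint name and "en: " prefix are assumptions).
mt_name = "VietAI/envit5-translation"
mt_tok = AutoTokenizer.from_pretrained(mt_name)
mt_model = AutoModelForSeq2SeqLM.from_pretrained(mt_name)

def translate_en_to_vi(texts):
    batch = mt_tok(["en: " + t for t in texts], return_tensors="pt",
                   padding=True, truncation=True, max_length=512)
    out = mt_model.generate(**batch, max_length=512)
    return mt_tok.batch_decode(out, skip_special_tokens=True)

abstracts_en = ["Mucin-type O-glycans are elongated by specific glycosyltransferases."]
abstracts_vi = translate_en_to_vi(abstracts_en)
print(abstracts_vi[0])

# Stage 2 (conceptual): the translated corpus would then feed T5-style
# span-corruption pretraining of an encoder-decoder model; loading a base
# T5 checkpoint as the starting point stands in for that step here.
vi_t5 = AutoModelForSeq2SeqLM.from_pretrained("google/t5-v1_1-base")

In the paper itself, the resulting ViPubmedT5 model is then evaluated on downstream Vietnamese biomedical benchmarks; the sketch above stops at corpus creation.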

Biomedical data and benchmarks are highly valuable yet very limited in low-resource languages other than English, such as Vietnamese. In this paper, we make use of a state-of-the-art English-Vietnamese translation model to translate and produce both pretrained and supervised data in the biomedical domains. Thanks to such large-scale translation, we introduce ViPubmedT5, an Encoder-Decoder Transformer trained on 20 million translated abstracts from the high-quality public PubMed corpus. ViPubMedT5 demonstrates results on two...

10.48550/arxiv.2210.05598 preprint EN cc-by arXiv (Cornell University) 2022-01-01