- Cancer Cells and Metastasis
- Cancer-related molecular mechanisms research
- Software Engineering Research
- Mesenchymal stem cell research
- Immune cells in cancer
- Vaccines and Immunoinformatics Approaches
- Topic Modeling
- Text and Document Classification Technologies
- Natural Language Processing Techniques
- MicroRNA in disease regulation
- Model-Driven Software Engineering Techniques
- Extracellular vesicles in disease
- Software Testing and Debugging Techniques
- Advanced Malware Detection Techniques
Peking University
2022-2024
Jiangsu University
2014
MicroRNAs (miRNAs) are involved in gastric cancer development and progression. However, the expression and role of miRNAs in gastric cancer stromal cells remain unclear. Differentially expressed miRNAs in gastric cancer tissue-derived mesenchymal stem cells (GC-MSCs) relative to MSCs from adjacent non-cancerous tissues (GCN-MSCs) were screened using a miRNA microarray and validated by quantitative RT–PCR. The impact of GC-MSCs on HGC-27 cells was observed in vitro using colony formation and transwell assays, and these cells were subcutaneously co-injected into mice to assess tumour growth in vivo....
Emerging evidence indicates that mesenchymal stem cells (MSCs) affect tumor progression by reshaping the tumor microenvironment. Neutrophils are an essential component of the tumor microenvironment and are critically involved in cancer progression. Whether the phenotype and function of neutrophils are influenced by MSCs is not well understood. Herein, we investigated the interaction between gastric cancer-derived MSCs (GC-MSCs) and neutrophils and explored the biological role of this interaction. We found that GC-MSCs induced the chemotaxis of neutrophils and protected them from spontaneous...
We present DeepSeek-Coder-V2, an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks. Specifically, DeepSeek-Coder-V2 is further pre-trained from an intermediate checkpoint of DeepSeek-V2 with an additional 6 trillion tokens. Through this continued pre-training, DeepSeek-Coder-V2 substantially enhances the coding and mathematical reasoning capabilities of DeepSeek-V2, while maintaining comparable performance in general language tasks. Compared to DeepSeek-Coder-33B, it demonstrates...
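Since the checkpoints are released openly, a minimal usage sketch may help make the abstract concrete. The model ID, precision, and generation settings below are assumptions for illustration, not details taken from the paper:

```python
# Minimal sketch: code completion with an open DeepSeek-Coder-V2 checkpoint.
# The model ID, dtype, and generation settings are assumptions for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

prompt = "# Write a function that checks whether a number is prime.\ndef is_prime(n: int) -> bool:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```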
Pre-trained models for code have exhibited promising performance across various code-related tasks, such as code summarization, completion, translation, and bug detection. However, despite their success, the majority of current models still represent code as a token sequence, which may not adequately capture the essence of its underlying structure.
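To make the token-sequence vs. structure contrast concrete, here is a small sketch (not from the paper) that shows the same snippet as a flat token list and as an abstract syntax tree, using only Python's standard library:

```python
# Sketch: the same snippet viewed as a flat token sequence vs. as a syntax tree.
# Illustrative only; the paper's models and representations are not reproduced here.
import ast
import io
import tokenize

source = "def add(a, b):\n    return a + b\n"

# 1) Token-sequence view: what most current pre-trained code models consume.
tokens = [tok.string for tok in tokenize.generate_tokens(io.StringIO(source).readline)
          if tok.string.strip()]
print(tokens)  # ['def', 'add', '(', 'a', ',', 'b', ')', ':', 'return', 'a', '+', 'b']

# 2) Structural view: the abstract syntax tree, which makes nesting explicit.
tree = ast.parse(source)
print(ast.dump(tree, indent=2))  # indent= requires Python 3.9+
```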
Code pre-trained models have shown promising effectiveness in various software engineering tasks. Among these tasks, many are related to software evolution and/or code editing. However, existing models often overlook real-world code editing data and the evolutionary nature of the editing process. In this paper, to simulate the step-by-step editing process of human developers, we propose DivoT5, a pre-trained model based on directional diffusion at the data level. We adopt two categories of pre-training tasks. The first category is mask denoising augmented with...
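The excerpt cuts off before describing the pre-training data construction. As a rough illustration of what a mask denoising example over an intermediate code version could look like, here is a generic T5-style span-masking sketch; it is an assumption for illustration, not the actual DivoT5 recipe:

```python
# Sketch: generic T5-style span masking applied to an intermediate code version.
# Illustrates the general idea of a mask-denoising pre-training example only;
# it is NOT the DivoT5 data construction, which this excerpt does not detail.
import random

def mask_spans(tokens, mask_ratio=0.15, seed=0):
    """Replace random token spans with sentinel tokens; return (input, target)."""
    rng = random.Random(seed)
    n_to_mask = max(1, int(len(tokens) * mask_ratio))
    masked = set(rng.sample(range(len(tokens)), n_to_mask))
    inp, target, sentinel, i = [], [], 0, 0
    while i < len(tokens):
        if i in masked:
            inp.append(f"<extra_id_{sentinel}>")
            target.append(f"<extra_id_{sentinel}>")
            while i in masked:          # collapse consecutive masked tokens into one span
                target.append(tokens[i])
                i += 1
            sentinel += 1
        else:
            inp.append(tokens[i])
            i += 1
    return " ".join(inp), " ".join(target)

# An "intermediate" version of the code partway through an edit.
intermediate = "def area ( r ) : return 3.14 * r * r".split()
print(mask_spans(intermediate))
```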
Word embedding has been widely used in various areas to boost the performance of neural models. However, when processing context-free languages, representing grammar rules with word embeddings loses two types of information. One is the structural relationship between rules, and the other is the content information of the rule definition. In this paper, we make the first attempt to learn a grammar-preserving rule embedding. We introduce a novel graph structure to represent the grammar. Then, we apply a Graph Neural Network (GNN) to extract the structural information and use a gating layer...
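As a rough illustration of the idea only (the paper's graph construction, GNN, and gating layer are not reproduced here), one can treat each grammar rule as a node, connect a rule to the rules that expand the nonterminals on its right-hand side, and run a gated message-passing step over that graph:

```python
# Sketch: grammar rules as graph nodes with one gated message-passing step.
# Toy illustration of the general idea only, not the paper's architecture.
import torch
import torch.nn as nn

# A tiny context-free grammar: each rule is (lhs, rhs-symbols).
rules = [
    ("expr", ["expr", "+", "term"]),    # rule 0
    ("expr", ["term"]),                 # rule 1
    ("term", ["term", "*", "factor"]),  # rule 2
    ("term", ["factor"]),               # rule 3
    ("factor", ["NUMBER"]),             # rule 4
]

# Edge i -> j if rule j's left-hand side appears on rule i's right-hand side.
edges = [(i, j) for i, (_, rhs) in enumerate(rules)
         for j, (lhs, _) in enumerate(rules) if lhs in rhs]

dim = 16
node_emb = nn.Embedding(len(rules), dim)  # one vector per grammar rule
msg_lin = nn.Linear(dim, dim)             # message transformation
gate = nn.Linear(2 * dim, dim)            # gating layer over (node, aggregated message)

h = node_emb.weight                                    # initial rule embeddings, (5, dim)
src_idx = torch.tensor([s for s, _ in edges])
dst_idx = torch.tensor([d for _, d in edges])
msgs = torch.tanh(msg_lin(h[src_idx]))                 # one message per edge
agg = torch.zeros_like(h).index_add(0, dst_idx, msgs)  # sum messages at each rule node

g = torch.sigmoid(gate(torch.cat([h, agg], dim=-1)))   # per-dimension gate
h_new = g * h + (1 - g) * agg                          # gated update of rule embeddings
print(h_new.shape)  # torch.Size([5, 16])
```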