Qihao Zhu

ORCID: 0009-0008-5155-451X
Research Areas
  • Cancer Cells and Metastasis
  • Cancer-related molecular mechanisms research
  • Software Engineering Research
  • Mesenchymal stem cell research
  • Immune cells in cancer
  • Vaccines and immunoinformatics approaches
  • Topic Modeling
  • Text and Document Classification Technologies
  • Natural Language Processing Techniques
  • MicroRNA in disease regulation
  • Model-Driven Software Engineering Techniques
  • Extracellular vesicles in disease
  • Software Testing and Debugging Techniques
  • Advanced Malware Detection Techniques

Peking University
2022-2024

Jiangsu University
2014

MicroRNAs (miRNAs) are involved in gastric cancer development and progression. However, the expression and role of miRNAs in stromal cells remain unclear. The miRNAs differentially expressed in gastric cancer tissue-derived mesenchymal stem cells (GC-MSCs) relative to MSCs from adjacent non-cancerous tissues (GCN-MSCs) were screened using a miRNA microarray and validated by quantitative RT–PCR. The impact of GC-MSCs on HGC-27 cells was observed in vitro by colony formation and transwell assays, and the cells were subcutaneously co-injected into mice to assess tumour growth in vivo....

10.1038/bjc.2014.14 article EN cc-by-nc-sa British Journal of Cancer 2014-01-28

Emerging evidence indicates that mesenchymal stem cells (MSCs) affect tumor progression by reshaping the tumor microenvironment. Neutrophils are an essential component of the microenvironment and are critically involved in cancer progression. Whether the phenotype and function of neutrophils are influenced by MSCs is not well understood. Herein, we investigated the interaction between neutrophils and gastric cancer-derived MSCs (GC-MSCs) and explored the biological role of this interaction. We found that GC-MSCs induced neutrophil chemotaxis and protected them from spontaneous...

10.1038/cddis.2014.263 article EN cc-by-nc-nd Cell Death and Disease 2014-06-19

We present DeepSeek-Coder-V2, an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks. Specifically, DeepSeek-Coder-V2 is further pre-trained from an intermediate checkpoint of DeepSeek-V2 with an additional 6 trillion tokens. Through this continued pre-training, DeepSeek-Coder-V2 substantially enhances the coding and mathematical reasoning capabilities of DeepSeek-V2, while maintaining comparable performance in general language tasks. Compared to DeepSeek-Coder-33B, it demonstrates...

10.48550/arxiv.2406.11931 preprint EN arXiv (Cornell University) 2024-06-17
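At the core of a Mixture-of-Experts layer like the one the abstract names is a gating network that routes each token to a small subset of experts. The sketch below is an illustrative top-k router in plain NumPy with toy dimensions, not DeepSeek-Coder-V2's actual architecture:

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route one token vector through the top-k experts of a toy MoE layer."""
    logits = x @ gate_w                      # one gating score per expert
    top = np.argsort(logits)[-k:]            # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over only the selected experts
    # weighted sum of the chosen experts' outputs
    return sum(w * experts[i](x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
x = rng.standard_normal(d)
gate_w = rng.standard_normal((d, n_experts))
# each "expert" is a simple linear map, standing in for an FFN sub-network
expert_ws = [rng.standard_normal((d, d)) for _ in range(n_experts)]
experts = [lambda v, w=w: v @ w for w in expert_ws]
y = moe_forward(x, gate_w, experts)
print(y.shape)  # (8,)
```

Only k of the n experts run per token, which is how MoE models grow total parameter count without a proportional increase in per-token compute.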

Pretrained models for code have exhibited promising performance across various code-related tasks, such as summarization, completion, translation, and bug detection. However, despite their success, the majority of current models still represent code as a token sequence, which may not adequately capture the essence of its underlying structure.

10.1145/3597503.3639125 article EN 2024-04-12
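The gap between a token sequence and the underlying structure can be seen with Python's standard `ast` module: the flat string `x = a + b` parses into a tree whose nodes make explicit the relationships the token list leaves implicit.

```python
import ast

# The token sequence "x = a + b" hides a tree:
# Assign(target=x, value=BinOp(left=a, op=Add, right=b))
tree = ast.parse("x = a + b")
assign = tree.body[0]
print(type(assign).__name__)        # Assign
print(type(assign.value).__name__)  # BinOp
print(ast.dump(assign.value.op))    # Add()
```

A token-level model sees `+` as just another symbol in a sequence; a structure-aware representation knows it is the operator node binding the two operands.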

Code pre-trained models have shown promising effectiveness in various software engineering tasks. Among these tasks, many are related to software evolution and/or code editing. However, existing models often overlook real-world code editing data and the evolutionary nature of the editing process. In this paper, to simulate the step-by-step editing process of human developers, we propose DivoT5, a pre-trained model based on directional diffusion at the data level. We adopt two categories of pre-training tasks. The first category is mask denoising augmented with...

10.48550/arxiv.2501.12079 preprint EN arXiv (Cornell University) 2025-01-21
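As a rough illustration of the mask-denoising idea, the snippet below corrupts a token sequence and pairs it with the original as a reconstruction target. This is a minimal sketch of the data preparation only; DivoT5's actual objectives, including the directional augmentation, are more involved.

```python
import random

MASK = "<mask>"

def mask_tokens(tokens, rate=0.3, seed=0):
    """Replace a random subset of tokens with a mask symbol.

    The (masked, original) pair forms one denoising training example:
    the model sees the corrupted sequence and must reconstruct the original.
    """
    rng = random.Random(seed)
    masked = [MASK if rng.random() < rate else t for t in tokens]
    return masked, tokens

old_code = "def add ( a , b ) : return a + b".split()
corrupted, target = mask_tokens(old_code)
print(corrupted)
```

Training on many such pairs teaches the model to fill in plausible code given its surrounding context, the same principle behind span-corruption pre-training in encoder-decoder code models.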

Word embedding has been widely used in various areas to boost the performance of neural models. However, when processing context-free languages, treating grammar rules as words loses two types of information. One is the structural relationship between rules, and the other is the content information in rule definitions. In this paper, we make the first attempt to learn a grammar-preserving rule embedding. We introduce a novel graph structure to represent the grammar. Then, we apply a Graph Neural Network (GNN) to extract rule embeddings and use a gating layer...

10.24963/ijcai.2022/631 article EN Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence 2022-07-01
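The core idea of representing a grammar as a graph and propagating information between rules can be sketched as follows. The toy grammar, the edge construction, and the single mean-aggregation round are illustrative stand-ins under assumed conventions, not the paper's actual graph design or GNN:

```python
import numpy as np

# Toy context-free grammar: each rule maps a nonterminal to a symbol sequence.
rules = {
    0: ("Expr", ["Expr", "+", "Term"]),
    1: ("Expr", ["Term"]),
    2: ("Term", ["num"]),
}

# Edge i -> j when rule j can expand a nonterminal on rule i's right-hand side,
# so the graph preserves the structural relationship between rules.
heads = {}
for j, (lhs, _) in rules.items():
    heads.setdefault(lhs, []).append(j)
edges = [(i, j) for i, (_, rhs) in rules.items()
         for sym in rhs for j in heads.get(sym, [])]

rng = np.random.default_rng(0)
h = rng.standard_normal((len(rules), 4))   # initial per-rule embeddings

# One round of mean-aggregation message passing over the rule graph.
new_h = h.copy()
for i in range(len(rules)):
    nbrs = [j for (a, j) in edges if a == i]
    if nbrs:
        new_h[i] = (h[i] + h[nbrs].mean(axis=0)) / 2
print(new_h.shape)  # (3, 4)
```

After message passing, each rule's embedding mixes in information from the rules that can expand it, which a flat word-style embedding of rule names cannot capture.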