Jinjing Zhou

ORCID: 0009-0003-9704-0112
Research Areas
  • Advanced Graph Neural Networks
  • Graph Theory and Algorithms
  • Machine Learning in Materials Science
  • Ferroelectric and Negative Capacitance Devices
  • Complex Network Analysis Techniques
  • Computational Drug Discovery Methods
  • Recommender Systems and Techniques
  • Topic Modeling
  • Liver physiology and pathology
  • Parallel Computing and Optimization Techniques
  • Biofield Effects and Biophysics
  • Nutrition and Health in Aging
  • Schizophrenia research and treatment
  • Advanced Memory and Neural Computing
  • Cancer-related gene regulation
  • Cancer Research and Treatments
  • Adversarial Robustness in Machine Learning
  • Psychosomatic Disorders and Their Treatments

Shanghai Jiao Tong University
2022-2024

Shanghai Mental Health Center
2024

Ruijin Hospital
2022

Shanghai Artificial Intelligence Laboratory
2020-2021

Amazon (Germany)
2021

Children's Hospital of Chongqing Medical University
2012

Chongqing Medical University
2012

Advancing research in the emerging field of deep graph learning requires new tools to support tensor computation over graphs. In this paper, we present the design principles and implementation of Deep Graph Library (DGL). DGL distills the computational patterns of GNNs into a few generalized sparse tensor operations suitable for extensive parallelization. By advocating graphs as the central programming abstraction, DGL can perform optimizations transparently. By cautiously adopting a framework-neutral design, DGL allows users to easily...

10.48550/arxiv.1909.01315 preprint EN other-oa arXiv (Cornell University) 2019-01-01
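The core observation in the abstract above — that GNN message passing distills down to generalized sparse operations — can be illustrated with a hedged toy sketch. This is not DGL's actual API; it uses a dense adjacency matrix as a stand-in for a sparse one, and the edge list and feature values are invented for illustration.

```python
import numpy as np

# Toy graph: directed edges (src -> dst); values are illustrative only.
edges = [(0, 1), (1, 2), (2, 0), (0, 2)]
num_nodes, feat_dim = 3, 2

# Dense adjacency stands in for a sparse matrix in this sketch.
A = np.zeros((num_nodes, num_nodes))
for src, dst in edges:
    A[dst, src] = 1.0  # message flows src -> dst

H = np.arange(num_nodes * feat_dim, dtype=float).reshape(num_nodes, feat_dim)

# One round of message passing with sum aggregation: each node sums its
# in-neighbors' features. This is exactly a (sparse) matrix product A @ H,
# which is why SpMM-style kernels parallelize GNN computation so well.
H_new = A @ H
print(H_new)
```

In a real system the same product runs as a sparse matrix-dense matrix multiplication (SpMM), so the whole aggregation step maps onto one highly parallel kernel.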

Graph neural networks (GNNs) constitute a class of deep learning methods for graph data. They have wide applications in chemistry and biology, such as molecular property prediction, reaction prediction, and drug-target interaction prediction. Despite the interest, GNN-based modeling is challenging, as it requires graph data preprocessing in addition to programming and deep learning. Here, we present Deep Graph Library (DGL)-LifeSci, an open-source package for deep learning on graphs in life science. DGL-LifeSci is a python toolkit based on RDKit, PyTorch,...

10.1021/acsomega.1c04017 article EN cc-by-nc-nd ACS Omega 2021-10-05
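The graph data preprocessing the abstract refers to is, at its core, turning a molecule into a graph: atoms become nodes, bonds become edges. A minimal dependency-free sketch of that idea follows — DGL-LifeSci itself builds on RDKit for this, so the hand-coded ethanol molecule and the tiny atom vocabulary here are purely illustrative assumptions.

```python
# Hand-coded ethanol (SMILES "CCO") stands in for RDKit parsing.
atoms = ["C", "C", "O"]              # node types
bonds = [(0, 1), (1, 2)]             # undirected bonds

# GNN frameworks typically store each undirected bond as two directed edges.
src, dst = [], []
for u, v in bonds:
    src += [u, v]
    dst += [v, u]

# One-hot atom features over a small, assumed vocabulary.
vocab = {"C": 0, "N": 1, "O": 2}
features = [[1.0 if vocab[a] == i else 0.0 for i in range(len(vocab))]
            for a in atoms]

print(src, dst)     # edge endpoints for the directed graph
print(features[2])  # one-hot feature vector for the oxygen atom
```

Real featurizers add much more per node (degree, charge, hybridization, aromaticity), but the node/edge/feature triple is the common output format.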

Graph neural networks (GNNs) have shown great success in learning from graph-structured data. They are widely used in various applications, such as recommendation, fraud detection, and search. In these domains, the graphs are typically large, containing hundreds of millions of nodes and several billions of edges. To tackle this challenge, we develop DistDGL, a system for training GNNs in a mini-batch fashion on a cluster of machines. DistDGL is based on Deep Graph Library (DGL), a popular GNN development framework. DistDGL distributes...

10.1109/ia351965.2020.00011 article EN 2020-11-01
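The mini-batch training mode mentioned above rests on neighbor sampling: each batch of seed nodes pulls in only a bounded, randomly sampled neighborhood rather than the full graph. A single-machine toy sketch of that step follows (graph partitioning and the distributed RPC machinery that DistDGL adds are omitted; the adjacency lists and fanout are invented for illustration).

```python
import random

# Toy adjacency lists; a production graph would have billions of edges.
graph = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2]}

def sample_neighbors(seeds, fanout, rng):
    """For each seed node, sample at most `fanout` of its neighbors."""
    block = {}
    for v in seeds:
        neighbors = graph[v]
        k = min(fanout, len(neighbors))
        block[v] = rng.sample(neighbors, k)
    return block

rng = random.Random(0)
seeds = [0, 2]                                   # one mini-batch of nodes
block = sample_neighbors(seeds, fanout=2, rng=rng)
# Stacking one such block per GNN layer bounds the computation per batch,
# independent of the total graph size.
```

In the distributed setting, the same sampling call is served by whichever machine owns the partition containing each seed's neighborhood.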

While many systems have been developed to train graph neural networks (GNNs), efficient model evaluation, which computes node embeddings according to a given model, remains to be addressed. For instance, using the widely adopted node-wise approach, evaluation can account for over 90% of the time in the end-to-end training process due to neighbor explosion, which means that a node accesses its multi-hop neighbors. The layer-wise approach avoids neighbor explosion by conducting computation layer by layer in GNN models. However, it takes...

10.1145/3580305.3599805 article EN Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining 2023-08-04
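The node-wise vs layer-wise distinction in the abstract can be made concrete with a small sketch. Assuming a toy 2-layer GNN with plain mean aggregation on a 3-node graph (all values illustrative), layer-wise inference computes every node's embedding one layer at a time and reuses each layer's output, while node-wise inference for a single node must still expand its multi-hop neighborhood.

```python
import numpy as np

# Toy triangle graph with mean aggregation over neighbors.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
deg = A.sum(1, keepdims=True)
X = np.eye(3)                       # illustrative input features

def layer(H):
    return (A @ H) / deg            # mean over neighbors

# Layer-wise: all nodes advance one layer at a time; each layer's result
# is computed once and reused by the next layer.
H1 = layer(X)
H2 = layer(H1)

# Node-wise: node 0's final embedding alone still requires its entire
# 2-hop neighborhood -- the "neighbor explosion" the paper refers to.
h0 = layer(layer(X))[0]
assert np.allclose(h0, H2[0])
```

On deep models and high-degree graphs the 2-hop neighborhood above becomes nearly the whole graph per node, which is why node-wise evaluation dominates end-to-end time.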

Graph neural networks (GNNs) have shown great success in learning from graph-structured data. They are widely used in various applications, such as recommendation, fraud detection, and search. In these domains, the graphs are typically large, containing hundreds of millions of nodes and several billions of edges. To tackle this challenge, we develop DistDGL, a system for training GNNs in a mini-batch fashion on a cluster of machines. DistDGL is based on Deep Graph Library (DGL), a popular GNN development framework. DistDGL distributes...

10.48550/arxiv.2010.05337 preprint EN other-oa arXiv (Cornell University) 2020-01-01

Despite the recent success of graph neural networks (GNNs), common architectures often exhibit significant limitations, including sensitivity to oversmoothing, long-range dependencies, and spurious edges, e.g., as can occur as a result of heterophily or adversarial attacks. To at least partially address these issues within a simple, transparent framework, we consider a new family of GNN layers designed to mimic and integrate the update rules of two classical iterative algorithms, namely, proximal gradient descent...

10.48550/arxiv.2103.06064 preprint EN other-oa arXiv (Cornell University) 2021-01-01
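A hedged sketch can show what "GNN layers as gradient-descent update rules" means in the simplest case. One common reading (an assumption here, not the paper's exact formulation) takes the graph-regularized energy E(Y) = ||Y - X||^2 + lam * tr(Y^T L Y) with Laplacian L = D - A, and treats each layer as one gradient step on E; the graph, features, and step size below are invented for illustration.

```python
import numpy as np

# Toy 3-node path graph: 0 -- 1 -- 2.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(1)) - A           # graph Laplacian D - A
X = np.array([[1.0], [0.0], [1.0]]) # input node features
lam, alpha = 1.0, 0.1               # regularization weight, step size

Y = X.copy()
for _ in range(50):
    grad = 2 * (Y - X) + 2 * lam * (L @ Y)  # dE/dY
    Y = Y - alpha * grad                     # one "layer" = one descent step
```

Each step smooths features across edges while staying anchored to the input X, so the layer family inherits the convergence behavior of the underlying optimizer instead of oversmoothing without bound; here the iterates approach the closed-form minimizer (I + lam*L)^{-1} X.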

Cognitive impairment is common in patients with chronic schizophrenia. The purpose of this study was to explore the efficacy of Shen-based Qigong Exercise (SBQE) in improving cognitive function in stable patients in rehabilitation wards. SBQE is derived from the theory of "body-spirit syncretism (xin shen he yi)" in traditional Chinese medicine (TCM) and extracted from four techniques. In a 12-week, randomized, single-blind, controlled study, a total of 40 patients were randomly assigned to either the SBQE group or the control group. Scores for the Scale for Assessment...

10.1186/s12888-024-06146-8 article EN cc-by-nc-nd BMC Psychiatry 2024-11-13

Sarcopenia and cognitive impairment are among the most prevalent causes of disability in older individuals. The aim of this study was to assess the prevalence of sarcopenia and the association between sarcopenia and cognitive impairment in these patients. A cross-sectional study was undertaken, comprising 250 male patients aged 65 and over. Sarcopenia was defined using the diagnostic criteria recommended by the consensus of the Asian Working Group for Sarcopenia, and participants were classified into sarcopenia and non-sarcopenia groups according to this definition. Cognitive functions were assessed with the Mini-Mental State Examination (MMSE). After bivariate...

10.6133/apjcn.202209_31(3).0021 article EN PubMed 2022-01-01

While many systems have been developed to train Graph Neural Networks (GNNs), efficient model inference and evaluation remain to be addressed. For instance, using the widely adopted node-wise approach, inference can account for up to 94% of the time in the end-to-end training process due to neighbor explosion, which means that a node accesses its multi-hop neighbors. On the other hand, the layer-wise approach avoids the neighbor explosion problem by conducting computation layer by layer, such that nodes only need their one-hop neighbors in each layer. However, implementing...

10.48550/arxiv.2211.15082 preprint EN other-oa arXiv (Cornell University) 2022-01-01