- Particle physics theoretical and experimental studies
- Quantum Chromodynamics and Particle Interactions
- High-Energy Particle Collisions Research
- Advanced Graph Neural Networks
- Computational Physics and Python Applications
- Explainable Artificial Intelligence (XAI)
- Complex Network Analysis Techniques
- Topic Modeling
- Adversarial Robustness in Machine Learning
- Black Holes and Theoretical Physics
- Neutrino Physics Research
- Machine Learning in Materials Science
- Domain Adaptation and Few-Shot Learning
- Recommender Systems and Techniques
- Bioinformatics and Genomic Networks
- Advanced Mathematical Theories
- Imbalanced Data Classification Techniques
- Text and Document Classification Technologies
- Privacy-Preserving Technologies in Data
- Stochastic processes and statistical mechanics
- Time Series Analysis and Forecasting
- Advanced Clustering Algorithms Research
- Advanced Computing and Algorithms
- Machine Learning in Healthcare
- Particle Accelerators and Free-Electron Lasers
Pennsylvania State University
2017-2025
North China Electric Power University
2021-2024
University of North Carolina at Charlotte
2024
Southeast University
2021
Google (United States)
2021
Harvard University
2020
Universidad del Noreste
2020
Southeast University
2013
Case Western Reserve University
2011
Graph Neural Networks (GNNs) have been shown to be powerful tools for graph analytics. The key idea is to recursively propagate and aggregate information along the edges of the given graph. Despite their success, however, existing GNNs are usually sensitive to the quality of the input graph. Real-world graphs are often noisy and contain task-irrelevant edges, which may lead to suboptimal generalization performance in the learned GNN models. In this paper, we propose PTDNet, a parameterized topological denoising network, to improve the robustness...
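To make the edge-denoising idea concrete, the sketch below shows one way a parameterized edge scorer can learn to down-weight task-irrelevant edges. It is a minimal illustration in plain PyTorch, not the authors' PTDNet implementation, and names such as EdgeDenoiser are hypothetical.

```python
import torch
import torch.nn as nn

class EdgeDenoiser(nn.Module):
    """Scores each edge from its endpoint features and produces a soft keep-probability."""
    def __init__(self, feat_dim, hidden_dim=32):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x, edge_index, temperature=0.5):
        # x: [num_nodes, feat_dim], edge_index: [2, num_edges]
        src, dst = edge_index
        logits = self.scorer(torch.cat([x[src], x[dst]], dim=-1)).squeeze(-1)
        # A binary-concrete relaxation keeps the edge mask differentiable during training.
        u = torch.rand_like(logits)
        keep_prob = torch.sigmoid((logits + torch.log(u) - torch.log(1 - u)) / temperature)
        return keep_prob  # multiply each edge's message by its keep-probability
```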
Contrastive learning has been widely applied to graph representation learning, where the view generators play a vital role in generating effective contrastive samples. Most of the existing methods employ pre-defined view generation methods, e.g., node drop or edge perturbation, which usually cannot adapt to the input data or preserve the original semantic structures well. To address this issue, we propose a novel framework named Automated Graph Contrastive Learning (AutoGCL) in this paper. Specifically, AutoGCL employs a set of learnable...
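The learnable-view-generator idea can be illustrated with a short sketch: each node samples an augmentation action (keep, drop, or mask features) from a learned distribution, made differentiable with the Gumbel-Softmax trick. This is an assumption-laden toy version, not the AutoGCL code, and all names are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableViewGenerator(nn.Module):
    """Per-node augmentation policy: keep the node, drop it, or mask its features."""
    def __init__(self, feat_dim, hidden_dim=64):
        super().__init__()
        self.policy = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 3),  # logits for (keep, drop, mask)
        )

    def forward(self, x, tau=1.0):
        # Sample a (nearly) one-hot action per node; gradients flow via Gumbel-Softmax.
        actions = F.gumbel_softmax(self.policy(x), tau=tau, hard=True)  # [N, 3]
        keep, mask = actions[:, 0:1], actions[:, 2:3]
        x_view = x * keep          # masked and dropped nodes get zeroed features
        node_weight = keep + mask  # dropped nodes contribute nothing downstream
        return x_view, node_weight.squeeze(-1)
```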
Various graph contrastive learning models have been proposed to improve the performance of learning tasks on graph datasets in recent years. While effective and prevalent, these models are usually carefully customized. In particular, although all recent studies create two contrastive views, they differ greatly in view augmentations, architectures, and objectives. It remains an open question how to build your graph contrastive learning model from scratch for particular datasets. In this work, we aim to fill this gap by studying how graph information is transformed and transferred during...
Graph Neural Networks (GNNs) have achieved promising results in various tasks such as node classification and graph classification. Recent studies find that GNNs are vulnerable to adversarial attacks. However, effective backdoor attacks on graphs are still an open problem. In particular, a backdoor attack poisons the graph by attaching triggers and the target class label to a set of nodes in the training graph. The backdoored GNN trained on the poisoned graph will then be misled to predict test nodes attached with triggers as the target class. Though there are some initial...
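The poisoning step described above can be pictured with a small, purely schematic helper that attaches a fully connected trigger subgraph to selected training nodes and flips their labels to the target class; it is not the attack studied in the paper, only an illustration of the general backdoor recipe.

```python
import numpy as np

def poison_graph(adj, labels, train_idx, target_class, trigger_size=3, n_poison=10, seed=0):
    """Attach a dense trigger subgraph to randomly chosen training nodes and relabel them."""
    rng = np.random.default_rng(seed)
    adj, labels = adj.copy(), labels.copy()
    victims = rng.choice(train_idx, size=n_poison, replace=False)
    for v in victims:
        n = adj.shape[0]
        # Grow the adjacency matrix with a fully connected trigger subgraph.
        new_adj = np.zeros((n + trigger_size, n + trigger_size), dtype=adj.dtype)
        new_adj[:n, :n] = adj
        new_adj[n:, n:] = 1 - np.eye(trigger_size)
        new_adj[v, n:] = new_adj[n:, v] = 1  # attach the trigger to the victim node
        adj = new_adj
        labels = np.concatenate([labels, np.full(trigger_size, target_class)])
        labels[v] = target_class             # flip the victim's label to the target class
    return adj, labels
```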
Node classification in graph-structured data aims to classify nodes when labels are only available for a subset of them. This problem has attracted considerable research effort in recent years. In real-world applications, both the graph topology and the node attributes evolve over time. Existing techniques, however, mainly focus on static graphs and lack the capability to simultaneously learn temporal and spatial/structural features. Node classification in dynamic attributed graphs is challenging for two major reasons. First, effectively modeling...
Network embedding aims to learn a low-dimensional vector representation for each node in social and information networks, with the constraint of preserving the network structures. Most existing methods focus on single-network embedding, ignoring the relationships between multiple networks. In many real-world applications, however, the multiple networks may contain complementary information, which can lead to further refined node embeddings. Thus, in this paper, we propose a novel multi-network embedding method, DMNE. DMNE is flexible. It allows...
Node classification is an important research topic in graph learning. Graph neural networks (GNNs) have achieved state-of-the-art performance on node classification. However, existing GNNs address the problem where samples from different classes are balanced, while in many real-world scenarios some classes may have far fewer instances than others. Directly training a GNN classifier in this case would under-represent samples from those minority classes and result in sub-optimal performance. Therefore, it is very important to develop...
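A common baseline for the imbalance problem described here, shown purely for context rather than as the paper's method, is to reweight the node classification loss inversely to class frequency:

```python
import torch
import torch.nn.functional as F

def weighted_node_loss(logits, labels, train_mask):
    """Cross-entropy over training nodes with inverse-frequency class weights."""
    train_labels = labels[train_mask]
    counts = torch.bincount(train_labels, minlength=logits.size(1)).float()
    weights = counts.sum() / (counts.clamp(min=1) * logits.size(1))  # rarer class -> larger weight
    return F.cross_entropy(logits[train_mask], train_labels, weight=weights)
```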
Edges in real-world graphs are typically formed by a variety of factors and carry diverse relation semantics. For example, connections in a social network could indicate friendship, being colleagues, or living in the same neighborhood. However, these latent relations are usually concealed behind mere edge existence due to the data collection and graph formation processes. Despite rapid developments in graph learning over the years, most models take a holistic approach and treat all edges as equal. One major difficulty in disentangling the latent relations is the lack...
Deep neural networks (DNNs) have shown superior performance on various multimodal learning problems. However, it often requires huge efforts to adapt DNNs to individual tasks by manually engineering unimodal features and designing feature fusion strategies. This paper proposes a Bilevel Multimodal Neural Architecture Search (BM-NAS) framework, which makes the architecture of multimodal fusion models fully searchable via a bilevel searching scheme. At the upper level, BM-NAS selects inter/intra-modal feature pairs from...
Despite recent progress in Graph Neural Networks (GNNs), explaining predictions made by GNNs remains a challenging and nascent problem. The leading method mainly considers local explanations, i.e., the important subgraph structure and node features, to interpret why a GNN model makes a prediction for a single instance, e.g. a node or a graph. As a result, the explanation generated is painstakingly customized at the instance level. The unique explanation interpreting each instance independently is not sufficient to provide a global understanding of...
Graph Neural Networks (GNNs) have resurged as a trending research subject owing to their impressive ability to capture representations from graph-structured data. However, the black-box nature of GNNs presents a significant challenge in terms of comprehending and trusting these models, thereby limiting their practical applications in mission-critical scenarios. Although there has been substantial progress in the field of explaining GNNs in recent years, the majority of these studies are centered on static graphs, leaving the explanation of dynamic...
The problem of learning and forecasting underlying trends in time series data arises in a variety of applications, such as traffic management, energy optimization, etc. In the literature, a trend is characterized by its slope and duration, and its prediction then amounts to forecasting these two values for the subsequent trend given the historical series. For this problem, existing approaches mainly deal with the univariate case. However, in many real-world settings there are multiple variables at play, and handling all of them at the same time is crucial for an accurate prediction...
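As a toy illustration of the trend representation above, the snippet below segments a series and summarizes each segment by its (slope, duration) pair; fixed-length segmentation is used only for simplicity and is not part of the paper's approach.

```python
import numpy as np

def extract_trends(series, segment_len=24):
    """Summarize each fixed-length segment of a 1-D series by its (slope, duration) pair."""
    trends = []
    for start in range(0, len(series) - segment_len + 1, segment_len):
        seg = series[start:start + segment_len]
        t = np.arange(len(seg))
        slope = np.polyfit(t, seg, deg=1)[0]  # least-squares slope of the segment
        trends.append((slope, len(seg)))      # (slope, duration)
    return trends
```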
Given a network with labels for a subset of its nodes, transductive node classification targets to predict the labels of the remaining nodes in the network. This technique has been used in a variety of applications such as voxel functionality detection in brain networks and group label prediction in social networks. Most existing approaches are designed for static networks. However, many real-world networks are dynamic and evolve over time. The dynamics of both node attributes and network topology jointly determine the node labels. In this paper, we study the problem of classifying nodes in dynamic networks, where the task is...
Uncovering the rationales behind predictions of graph neural networks (GNNs) has received increasing attention over recent years. Instance-level GNN explanation aims to discover the critical input elements, like nodes or edges, that the target GNN relies upon for making predictions. Though various algorithms have been proposed, most of them formalize this task by searching for the minimal subgraph which can preserve the original prediction. However, an inductive bias is deep-rooted in this framework: several subgraphs can result in the same or similar...
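The "minimal subgraph" formulation mentioned above is often implemented as edge-mask optimization (in the style of GNNExplainer): a soft mask over edges is trained to preserve the original prediction while staying sparse. The sketch below assumes a GNN that accepts per-edge weights; that interface and all names are hypothetical.

```python
import torch

def explain_instance(model, x, edge_index, node_idx, label, steps=200, sparsity=0.01):
    """Learn a soft edge mask that preserves the prediction for node_idx while staying sparse."""
    mask_logits = torch.zeros(edge_index.size(1), requires_grad=True)
    opt = torch.optim.Adam([mask_logits], lr=0.05)
    for _ in range(steps):
        edge_weight = torch.sigmoid(mask_logits)
        log_probs = torch.log_softmax(model(x, edge_index, edge_weight), dim=-1)
        # Keep the original prediction (high log-prob of `label`) with as few edges as possible.
        loss = -log_probs[node_idx, label] + sparsity * edge_weight.sum()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.sigmoid(mask_logits).detach()  # higher value = more important edge
```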
Graph Neural Networks (GNNs) have demonstrated significant success in learning from graph-structured data across various domains. Despite their great success, one critical challenge is often overlooked by existing works, i.e., learning message propagation that can generalize effectively to underrepresented graph regions. These minority regions often exhibit irregular homophily/heterophily patterns and diverse neighborhood class distributions, resulting in ambiguity. In this work, we investigate...
Joint clustering of multiple networks has been shown to be more accurate than performing clustering on individual networks separately. This is because multi-network clustering algorithms typically assume there is a common clustering structure shared by all networks, and different networks can provide compatible and complementary information for uncovering this underlying structure. However, this assumption is too strict to hold in many emerging applications, where the networks usually have diverse data distributions. More popularly, the networks under consideration may belong to different groups. Only...
Network modeling aims to learn the latent representations of nodes such that they preserve both network structures and node attribute information. This problem is fundamental due to its prevalence in numerous domains. However, existing approaches either target static networks or struggle to capture the complicated temporal dependency, while most real-world networks evolve over time and success hinges on understanding how entities are temporally connected. In this paper, we present TRRN, a transformer-style relational...
Graph neural networks (GNNs) have achieved great success in various graph problems. However, most GNNs are Message Passing Neural Networks (MPNNs) built on the homophily assumption, where nodes with the same label are connected in the graph. Real-world problems, however, also bring us heterophily, where nodes with different labels are connected. MPNNs fail to address this problem because they mix information from different distributions and are not good at capturing global patterns. Therefore, we investigate a novel Graph Memory Network model for Heterophilous Graphs (HP-GMN)...
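A quantity frequently used to separate the two regimes mentioned above is the edge homophily ratio, the fraction of edges whose endpoints share a label. The helper below is illustrative and not part of HP-GMN.

```python
import numpy as np

def edge_homophily(edge_index, labels):
    """Fraction of edges connecting two nodes with the same label."""
    src, dst = edge_index                  # each of shape [num_edges]
    return (labels[src] == labels[dst]).mean()

# Example: a 4-node path graph with labels [0, 0, 1, 1] has homophily 2/3.
edges = np.array([[0, 1, 2], [1, 2, 3]])
print(edge_homophily(edges, np.array([0, 0, 1, 1])))  # 0.666...
```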
Security of deep neural network (DNN) inference engines, i.e., trained DNN models running on various platforms, has become one of the biggest challenges in deploying artificial intelligence in domains where privacy, safety, and reliability are of paramount importance, such as medical applications. In addition to classic software attacks such as model inversion and evasion attacks, recently a new attack surface, implementation attacks, which include both passive side-channel attacks and active fault injection and adversarial attacks, is...
Imitation learning, which learns an agent policy by mimicking expert demonstrations, has shown promising results in many applications such as medical treatment regimes and self-driving vehicles. However, it remains a difficult task to interpret the control policies learned by the agent. Difficulties mainly come from two aspects: 1) agents in imitation learning are usually implemented as deep neural networks, which are black-box models and lack interpretability; 2) the latent causal mechanism behind agents' decisions may vary...
Automating medical diagnosis is an important data mining problem, which is to infer the likely disease(s) for the observed symptoms. Algorithms for this problem are very beneficial as a supplement to a real diagnosis. Existing methods typically perform the inference on a sparse bipartite graph with two sets of nodes representing diseases and symptoms, respectively. By using this graph, existing methods basically assume that no direct dependency exists between diseases (or between symptoms), which may not be true in reality. To address this limitation,...
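The bipartite setting these methods build on can be illustrated with a naive scorer that ranks diseases by their overlap with the observed symptoms; this mimics the baseline-style inference the paper improves upon, not the proposed method, and the data here are made up.

```python
import numpy as np

def rank_diseases(bipartite, observed_symptoms):
    """Rank diseases by the fraction of their linked symptoms that were observed."""
    # bipartite: [num_diseases, num_symptoms] binary disease-symptom incidence matrix
    observed = np.zeros(bipartite.shape[1])
    observed[observed_symptoms] = 1.0
    overlap = bipartite @ observed               # matched symptoms per disease
    degree = bipartite.sum(axis=1).clip(min=1)   # avoid division by zero
    return np.argsort(-(overlap / degree))       # best-matching diseases first

graph = np.array([[1, 1, 0, 0],    # disease 0 linked to symptoms 0 and 1
                  [0, 1, 1, 1]])   # disease 1 linked to symptoms 1, 2, 3
print(rank_diseases(graph, observed_symptoms=[0, 1]))  # disease 0 ranked first
```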