- Machine Learning and Data Classification
- Text and Document Classification Technologies
- Advanced Combustion Engine Technologies
- Combustion and flame dynamics
- Machine Learning and Algorithms
- Domain Adaptation and Few-Shot Learning
- Advanced Image and Video Retrieval Techniques
- Image Retrieval and Classification Techniques
- Advanced Graph Neural Networks
- Topic Modeling
- Imbalanced Data Classification Techniques
- Recommender Systems and Techniques
- Anomaly Detection Techniques and Applications
- Music and Audio Processing
- Multimodal Machine Learning Applications
- Natural Language Processing Techniques
- Vehicle emissions and performance
- Heat transfer and supercritical fluids
- Adversarial Robustness in Machine Learning
- Advanced Neural Network Applications
- Biodiesel Production and Applications
- Combustion and Detonation Processes
- Catalytic Processes in Materials Science
- Machine Learning in Bioinformatics
- Chemical Thermodynamics and Molecular Structure
- Nanyang Technological University (2018-2025)
- Singapore University of Technology and Design (2024)
- China Electronics Technology Group Corporation (2022-2024)
- Tianjin University (2017-2022)
- Chongqing University (2020-2022)
- Chinese Academy of Sciences (2011-2021)
- RIKEN Center for Advanced Intelligence Project (2021)
- Beijing Haidian Hospital (2020)
- Shanghai Jiao Tong University (2019)
- Academy of Opto-Electronics (2011-2018)
Deep learning with noisy labels is a practically challenging problem in weakly-supervised learning. The state-of-the-art approaches "Decoupling" and "Co-teaching+" claim that the "disagreement" strategy is crucial for alleviating the problem of learning with noisy labels. In this paper, we start from a different perspective and propose a robust learning paradigm called JoCoR, which aims to reduce the diversity of two networks during training. Specifically, we first use two networks to make predictions on the same mini-batch of data and calculate a joint loss...
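The JoCoR idea can be sketched in a few lines; this is a minimal illustration, assuming the joint loss combines per-network cross-entropy with a symmetric-KL co-regularization term and that small-joint-loss examples are treated as likely clean (the weighting `lam` and `keep_ratio` values are illustrative, not the paper's):

```python
import math

def cross_entropy(p, y):
    """Cross-entropy of a predicted distribution p against hard label y."""
    return -math.log(max(p[y], 1e-12))

def sym_kl(p, q):
    """Symmetric KL divergence between two predicted distributions."""
    def kl(a, b):
        return sum(ai * math.log(max(ai, 1e-12) / max(bi, 1e-12))
                   for ai, bi in zip(a, b))
    return kl(p, q) + kl(q, p)

def joint_loss(p1, p2, y, lam=0.9):
    """Joint loss: supervised terms for both networks plus a
    co-regularization term that penalizes disagreement between them."""
    supervised = cross_entropy(p1, y) + cross_entropy(p2, y)
    return (1 - lam) * supervised + lam * sym_kl(p1, p2)

def select_small_loss(batch, keep_ratio=0.7, lam=0.9):
    """Keep the fraction of examples with the smallest joint loss;
    these are used to update both networks jointly."""
    losses = sorted((joint_loss(p1, p2, y, lam), i)
                    for i, (p1, p2, y) in enumerate(batch))
    k = max(1, int(keep_ratio * len(batch)))
    return [i for _, i in losses[:k]]
```

An example on which the two networks agree incurs a much smaller joint loss than one on which they disagree, so agreement acts as the clean-sample signal.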
Partial label learning deals with the problem where each training instance is assigned a set of candidate labels, only one of which is correct. This paper provides the first attempt to leverage the idea of self-training for dealing with partially labeled examples. Specifically, we propose a unified formulation with proper constraints to train the desired model and perform pseudo-labeling jointly. For pseudo-labeling, unlike traditional self-training that manually differentiates the ground-truth label with high enough confidence, we introduce the maximum infinity...
Trained with the standard cross entropy loss, deep neural networks can achieve great performance on correctly labeled data. However, if the training data is corrupted with label noise, deep models tend to overfit the noisy labels, thereby achieving poor generalization performance. To remedy this issue, several loss functions have been proposed and demonstrated to be robust to label noise. Although most of the robust loss functions stem from Categorical Cross Entropy (CCE) loss, they fail to embody the intrinsic relationships between CCE and other loss functions. In this paper,...
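One well-known example of such a CCE-derived robust loss is the generalized cross entropy of Zhang and Sabuncu (2018), which interpolates between CCE and MAE; it is shown here only to illustrate the kind of relationship the abstract refers to, not as this paper's own framework:

```python
import math

def cce(p_y):
    """Categorical cross entropy for the probability p_y of the given label."""
    return -math.log(max(p_y, 1e-12))

def mae(p_y):
    """Mean absolute error against a one-hot target reduces to 2 * (1 - p_y)."""
    return 2.0 * (1.0 - p_y)

def gce(p_y, q=0.7):
    """Generalized cross entropy: (1 - p_y**q) / q.
    As q -> 0 it recovers CCE; at q = 1 it equals MAE up to a factor of 2."""
    return (1.0 - p_y ** q) / q
```

Intermediate values of `q` trade CCE's fast convergence against MAE's noise robustness.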
Learning with noisy labels (LNL) aims to ensure model generalization given a label-corrupted training set. In this work, we investigate a rarely studied scenario of LNL on fine-grained datasets (LNL-FG), which is more practical and challenging as the large inter-class ambiguities among fine-grained classes cause more noisy labels. We empirically show that existing methods that work well for LNL fail to achieve satisfying performance on LNL-FG, raising the need for effective solutions to LNL-FG. To this end, we propose a novel framework called...
In partial label learning, each training example is assigned a set of candidate labels, only one of which is the ground-truth label. Existing learning frameworks either assume equal confidence over the candidate labels or consider the ground-truth label as a latent variable hidden in an indiscriminate candidate set, while the different labeling confidence levels of the candidate labels are regrettably ignored. In this paper, we formalize the labeling confidence levels as latent label distributions, and propose a novel unified framework to estimate the label distributions and train the model simultaneously. Specifically, we present a biconvex formulation with constrained...
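The contrast with equal-confidence treatment can be made concrete with a toy estimator; this is a hedged sketch (the paper's biconvex formulation is not reproduced) that simply renormalizes nonnegative model scores over the candidate set to obtain a labeling-confidence distribution:

```python
def estimate_label_distribution(scores, candidates):
    """Renormalize nonnegative model scores over the candidate set,
    yielding graded labeling confidences instead of equal confidence
    (which would assign 1/len(candidates) to every candidate)."""
    total = sum(scores[c] for c in candidates)
    return {c: scores[c] / total for c in candidates}
```

In an alternating scheme, such a distribution would be re-estimated as the model improves, and the model retrained against the updated confidences.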
Partial label learning is a weakly supervised learning framework, in which each instance is provided with multiple candidate labels while only one of them is correct. Most of the existing approaches focus on leveraging label relationships to disambiguate the given noisy label space, but it is still unclear whether we can exploit potentially useful information in the feature space to alleviate the labeling ambiguities. This paper gives a positive answer to this question for the first time. Specifically, if two instances do not share any common candidate labels, they cannot have...
In this paper, polycyclic aromatic hydrocarbon (PAH) and soot formation in rich partially premixed and nonpremixed flames were studied using a blend of n-heptane and toluene. The fuel was diluted with Ar, N2, and CO2 to control the flame temperature. Laser-induced fluorescence, laser-induced incandescence, and two-color pyrometry were used to study the effects of temperature on PAH and soot evolution. Results show that the distributions are similar for different dilution gases at low flow rates. However, the high-temperature area increases...
Partial-label learning (PLL) is a typical weakly supervised learning problem, where each training instance is equipped with a set of candidate labels among which only one is the true label. Most existing methods elaborately designed their learning objectives as constrained optimizations that must be solved in specific manners, making their computational complexity a bottleneck for scaling up to big data. The goal of this paper is to propose a novel framework of PLL with flexibility on the model and optimization algorithm. More specifically, we...
Videos flow as a mixture of language, acoustic, and vision modalities. A thorough video understanding needs to fuse the time-series data of the different modalities for prediction. Due to the variable receiving frequency of the sequences from each modality, there usually exists inherent asynchrony across the collected multimodal streams. Towards an efficient fusion of asynchronous multimodal streams, we need to model the correlations between elements from different modalities. The recent Multimodal Transformer (MulT) approach extends the self-attention mechanism...
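The cross-modal extension of self-attention can be sketched as scaled dot-product attention whose queries come from one modality and whose keys/values come from another; this is a generic illustration of the mechanism MulT builds on, not that paper's full architecture:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def cross_modal_attention(queries, keys, values):
    """Scaled dot-product attention where queries come from one modality
    (e.g. language) and keys/values from another (e.g. audio). Each query
    attends over every key position, so the two streams need not be
    pre-aligned or of equal length."""
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        w = softmax(scores)
        out.append([sum(wj * v[i] for wj, v in zip(w, values))
                    for i in range(len(values[0]))])
    return out
```

Because every query position attends over the entire other stream, asynchrony between the streams is absorbed by the attention weights rather than by explicit alignment.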
Learning with Noisy Labels (LNL) has become an appealing topic, as imperfectly annotated data are relatively cheaper to obtain. Recent state-of-the-art approaches employ specific selection mechanisms to separate clean and noisy samples and then apply Semi-Supervised Learning (SSL) techniques for improved performance. However, the selection step mostly provides a medium-sized and decent-enough subset, which overlooks a rich set of clean samples. To fulfill this, we propose a novel LNL framework ProMix that attempts to maximize the utility...
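A selection step that enlarges the clean subset beyond plain small-loss filtering can be sketched as follows; a hedged toy version assuming two criteria are combined, small training loss and high-confidence agreement with the given label (thresholds are illustrative, not ProMix's exact mechanism):

```python
def select_clean(examples, keep_ratio=0.5, conf_threshold=0.9):
    """Each example is (loss, confidence, predicted_label, given_label).
    Combine small-loss selection with a matched high-confidence criterion:
    an example whose prediction agrees with its given label at high
    confidence is also kept, enlarging the selected clean subset."""
    by_loss = sorted(range(len(examples)), key=lambda i: examples[i][0])
    small_loss = set(by_loss[: max(1, int(keep_ratio * len(examples)))])
    matched = {i for i, (_, conf, pred, y) in enumerate(examples)
               if pred == y and conf >= conf_threshold}
    return sorted(small_loss | matched)
```

The union of the two criteria recovers confidently-predicted clean samples that a fixed small-loss quota alone would discard.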
Partial label learning (PLL) is an important problem that allows each training example to be labeled with a coarse candidate set, with the ground truth included. However, in a more practical but challenging scenario, the annotator may miss the ground truth and provide a wrong candidate set, which is known as the noisy PLL problem. To remedy this problem, we propose the PiCO+ framework that simultaneously disambiguates the candidate sets and mitigates label noise. At the core of PiCO+, we develop a novel label disambiguation algorithm PiCO that consists of a contrastive learning module along with a class...
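Prototype-based disambiguation, one common companion to a contrastive module, can be sketched as picking the candidate whose class prototype lies closest to the instance embedding; a minimal illustration under that assumption, not PiCO's full update rule:

```python
def disambiguate(embedding, candidates, prototypes):
    """Among the candidate labels, pick the class whose prototype
    (e.g. a running mean of that class's embeddings) is nearest to the
    instance embedding in squared Euclidean distance."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(candidates, key=lambda c: sq_dist(embedding, prototypes[c]))
```

As the contrastive module sharpens the embedding space, the nearest-prototype choice becomes an increasingly reliable pseudo-target, which in turn improves the prototypes.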
It is well-known that exploiting label correlations is crucially important to multi-label learning. Most of the existing approaches take label correlations as prior knowledge, which may not correctly characterize the real relationships among labels. Besides, label correlations are normally used to regularize the hypothesis space, while the final predictions are not explicitly correlated. In this paper, we suggest that for each individual label, the final prediction involves a collaboration between its own prediction and the predictions of the other labels. Based on this assumption, we first propose a novel method to learn...
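The collaboration assumption can be written as a convex mix of a label's own score with a correlation-weighted combination of the other labels' scores; a hedged sketch where the mixing weight `alpha` and the correlation matrix are illustrative placeholders (in the paper's setting the correlations would be learned, not given):

```python
def collaborate(own_scores, correlation, alpha=0.5):
    """Final prediction for label j mixes its own score with the scores of
    the other labels, weighted by correlation[k][j] (influence of label k
    on label j). alpha controls how much each label trusts itself."""
    n = len(own_scores)
    out = []
    for j in range(n):
        others = sum(correlation[k][j] * own_scores[k]
                     for k in range(n) if k != j)
        out.append(alpha * own_scores[j] + (1 - alpha) * others)
    return out
```

A strongly correlated, confidently predicted label can thus pull up a weak label's final score at prediction time, rather than only shaping the hypothesis space during training.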
Partial-label learning (PLL) is a multi-class classification problem, where each training example is associated with a set of candidate labels. Even though many practical PLL methods have been proposed in the last two decades, there lacks a theoretical understanding of the consistency of those methods: none of the methods hitherto possesses a generation process of the candidate label sets, and thus it is still unclear why such a method works on a specific dataset and when it may fail given a different dataset. In this paper, we propose the first generation model and develop...
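What "a generation process of candidate label sets" means can be illustrated with the simplest such model; a hedged sketch assuming a uniform process in which each incorrect label independently enters the candidate set with a fixed probability (the flip probability and function name are illustrative):

```python
import random

def generate_candidate_set(true_label, num_classes, flip_prob=0.3, rng=None):
    """Toy uniform generation process: the true label is always included,
    and every other label independently joins the candidate set with
    probability flip_prob. An explicit statistical model like this is the
    prerequisite for analyzing whether a PLL method is risk-consistent."""
    rng = rng or random.Random(0)
    candidates = {true_label}
    candidates |= {c for c in range(num_classes)
                   if c != true_label and rng.random() < flip_prob}
    return candidates
```

Once the generation model is fixed, the classification risk can be rewritten in terms of candidate-set observations, which is what makes consistency statements about a learning method possible.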