Limited-Supervised Multi-Label Learning with Dependency Noise

DOI: 10.1609/aaai.v38i14.29494 Publication Date: 2024-03-25T11:31:12Z
ABSTRACT
Limited-supervised multi-label learning (LML) leverages weak or noisy supervision to train classification models over data with label noise, which may contain missing labels and/or redundant labels. Existing studies usually solve LML problems by assuming that the noise is independent of the input features and class labels, ignoring the fact that noise may depend on the input features (instance-dependent) and the class labels (label-dependent) in many real-world applications. In this paper, we propose limited-supervised Multi-label Learning with Dependency Noise (MLDN), which simultaneously identifies instance-dependent and label-dependent noise by factorizing the noise matrix as the output of a mapping from the feature and label representations. Meanwhile, we regularize the problem with a manifold constraint to preserve local relationships and uncover the underlying structure. Theoretically, we bound the noise recovery error of the resulting problem. We solve the problem using a first-order scheme based on the proximal operator, whose convergence rate is at least sub-linear. Extensive experiments conducted on various datasets demonstrate the superiority of our proposed method.
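As a rough illustration of the kind of first-order proximal scheme the abstract refers to, the sketch below alternates a gradient step on a linear feature-to-label mapping with a proximal step on a noise matrix, under a manifold-style regularizer built from a kNN-graph Laplacian. Everything in it is an assumption for illustration only: the squared loss, the ℓ1 penalty on the noise matrix (handled by a soft-thresholding proximal operator), the Laplacian standing in for the manifold constraint, and all names and parameters (knn_laplacian, prox_gradient_mldn_sketch, alpha, lam) are hypothetical and not taken from the paper.

```python
import numpy as np

def knn_laplacian(X, k=5):
    """Unnormalized graph Laplacian from a kNN graph over rows of X
    (a simple stand-in for a manifold constraint preserving local relationships)."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]                  # skip the point itself
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (d2[i, nbrs].mean() + 1e-12))
    W = np.maximum(W, W.T)                                  # symmetrize the graph
    return np.diag(W.sum(1)) - W

def soft_threshold(A, tau):
    """Proximal operator of tau * ||A||_1 (elementwise soft-thresholding)."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def prox_gradient_mldn_sketch(X, Y_obs, alpha=0.1, lam=0.05, step=None, iters=200):
    """Proximal-gradient sketch (assumed objective, not the paper's) for
        min_{W,N} 0.5*||Y_obs - X W - N||_F^2 + alpha*tr(N^T L N) + lam*||N||_1,
    where X W predicts clean labels and N models the (sparse) label noise."""
    n, d = X.shape
    q = Y_obs.shape[1]
    L = knn_laplacian(X)
    W = np.zeros((d, q))
    N = np.zeros((n, q))
    if step is None:
        # crude step size from the spectral norms of the smooth terms
        step = 1.0 / (np.linalg.norm(X, 2) ** 2 + 2 * alpha * np.linalg.norm(L, 2) + 1.0)
    for _ in range(iters):
        R = X @ W + N - Y_obs                    # residual of the noisy observations
        W -= step * (X.T @ R)                    # gradient step on the label mapping
        grad_N = R + 2 * alpha * (L @ N)         # smooth part of the noise objective
        N = soft_threshold(N - step * grad_N, step * lam)   # prox step on the noise
    return W, N

# toy usage on random data: recover a mapping W and a noise estimate N
X = np.random.randn(50, 10)
Y_obs = (np.random.rand(50, 4) > 0.7).astype(float)
W, N = prox_gradient_mldn_sketch(X, Y_obs)
clean_scores = X @ W
```

The soft-thresholding prox keeps the per-iteration cost at one gradient evaluation plus an elementwise operation, which is consistent with the sub-linear convergence rate typical of first-order proximal methods.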