Lingao Xiao

ORCID: 0009-0007-1697-1986
Research Areas
  • Advanced Neural Network Applications
  • Anomaly Detection Techniques and Applications
  • Neural Networks and Applications
  • Machine Learning and Data Classification
  • Artificial Immune Systems Applications
  • Advanced Data Compression Techniques
  • Advanced Clustering Algorithms Research
  • Domain Adaptation and Few-Shot Learning
  • Data Stream Mining Techniques
  • Multimodal Machine Learning Applications
  • Advanced Data Processing Techniques
  • Adversarial Robustness in Machine Learning

Agency for Science, Technology and Research
2023

Institute of High Performance Computing
2023

The remarkable performance of deep Convolutional Neural Networks (CNNs) is generally attributed to their deeper and wider architectures, which can come with significant computational costs. Pruning has thus gained interest since it effectively lowers storage and computational costs. In contrast to weight pruning, which results in unstructured models, structured pruning provides the benefit of realistic acceleration by producing models that are friendly to hardware implementation. The special requirements of structured pruning have led to the discovery of numerous...

10.1109/tpami.2023.3334614 article EN IEEE Transactions on Pattern Analysis and Machine Intelligence 2023-11-28
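
To make the distinction between unstructured weight pruning and structured pruning concrete, here is a minimal PyTorch sketch of filter-level (structured) pruning. The L1-norm ranking criterion is one classic heuristic rather than the survey's own proposal, and the keep ratio is an illustrative assumption.

```python
# A minimal sketch of structured (filter-level) pruning. Removing whole
# filters shrinks tensor shapes, which is what yields realistic speedups
# on ordinary hardware, unlike masking individual weights.
import torch
import torch.nn as nn

def l1_filter_scores(conv: nn.Conv2d) -> torch.Tensor:
    """Score each output filter by the L1 norm of its weights."""
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))

def prune_conv_filters(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    """Return a new, physically smaller Conv2d keeping the top-scoring filters."""
    scores = l1_filter_scores(conv)
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    keep = torch.topk(scores, n_keep).indices.sort().values
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    with torch.no_grad():
        pruned.weight.copy_(conv.weight[keep])
        if conv.bias is not None:
            pruned.bias.copy_(conv.bias[keep])
    return pruned

conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
smaller = prune_conv_filters(conv, keep_ratio=0.25)
print(smaller.weight.shape)  # torch.Size([16, 3, 3, 3])
```

In a full network, downstream layers would also need their input channels trimmed to match the pruned filters; that bookkeeping is the main practical cost of structured pruning.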

Dataset distillation and dataset pruning are two prominent techniques for compressing datasets to improve computational and storage efficiency. Despite their overlapping objectives, these approaches are rarely compared directly. Even within each field, the evaluation protocols are inconsistent across various methods, which complicates fair comparisons and hinders reproducibility. Considering these limitations, we introduce in this paper a benchmark that equitably evaluates methodologies from both literatures. Notably,...

10.48550/arxiv.2502.06434 preprint EN arXiv (Cornell University) 2025-02-10
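
A minimal sketch of the kind of unified protocol such a benchmark implies: the compressed dataset is the only variable, while the model, optimizer, training budget, and test set stay fixed. The function and all hyperparameters below are illustrative assumptions, not the paper's actual benchmark API.

```python
# A shared evaluation harness: whichever method produced the compressed
# dataset (distillation or pruning), the downstream training recipe is fixed.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def evaluate_compressed_dataset(train_x, train_y, test_x, test_y,
                                num_classes=10, epochs=20, lr=0.01):
    """Train a fixed model on a compressed dataset; report test accuracy."""
    model = nn.Sequential(nn.Flatten(),
                          nn.Linear(train_x[0].numel(), 128), nn.ReLU(),
                          nn.Linear(128, num_classes))
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loader = DataLoader(TensorDataset(train_x, train_y),
                        batch_size=64, shuffle=True)
    for _ in range(epochs):
        for xb, yb in loader:
            opt.zero_grad()
            nn.functional.cross_entropy(model(xb), yb).backward()
            opt.step()
    with torch.no_grad():
        return (model(test_x).argmax(1) == test_y).float().mean().item()

# Toy usage: identical budget and test set for both compressed datasets.
torch.manual_seed(0)
test_x, test_y = torch.randn(256, 3, 8, 8), torch.randint(0, 10, (256,))
distilled = (torch.randn(100, 3, 8, 8), torch.randint(0, 10, (100,)))
pruned = (torch.randn(100, 3, 8, 8), torch.randint(0, 10, (100,)))
print(evaluate_compressed_dataset(*distilled, test_x, test_y))
print(evaluate_compressed_dataset(*pruned, test_x, test_y))
```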

While dataset condensation effectively enhances training efficiency, its application in on-device scenarios brings unique challenges. 1) Due to the fluctuating computational resources of these devices, there is a demand for a flexible dataset size that diverges from a predefined size. 2) The limited computational power on devices often prevents additional condensation operations. These two challenges connect to the "subset degradation problem" in traditional dataset condensation: a subset of a larger condensed dataset is unrepresentative compared to directly...

10.48550/arxiv.2403.06075 preprint EN arXiv (Cornell University) 2024-03-09
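
The "subset degradation problem" can be made concrete with a short sketch: from one condensed dataset stored at a fixed images-per-class (IPC) budget, a smaller on-device budget is met by slicing. The leading-images selection rule below is an illustrative baseline, not this paper's method; the paper's observation is that such slices underperform direct condensation to the smaller size.

```python
# A sketch of the on-device setting: one stored condensed dataset,
# sliced per class to meet a smaller IPC budget.
import torch

def take_subset_per_class(images, labels, ipc):
    """Keep the first `ipc` images of every class from a condensed dataset."""
    keep = []
    for c in labels.unique():
        idx = (labels == c).nonzero(as_tuple=True)[0]
        keep.append(idx[:ipc])
    keep = torch.cat(keep)
    return images[keep], labels[keep]

# Toy condensed set: 10 classes at IPC=10, sliced down to IPC=2.
images = torch.randn(100, 3, 32, 32)
labels = torch.arange(10).repeat_interleave(10)
sub_x, sub_y = take_subset_per_class(images, labels, ipc=2)
print(sub_x.shape, sub_y.shape)  # torch.Size([20, 3, 32, 32]) torch.Size([20])
```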

Dataset condensation is a crucial tool for enhancing training efficiency by reducing the size of the training dataset, particularly in on-device scenarios. However, these scenarios have two significant challenges: 1) the varying computational resources available on devices require a dataset size different from the pre-defined condensed dataset, and 2) the limited computational resources often preclude the possibility of conducting additional condensation processes. We introduce You Only Condense Once (YOCO) to overcome these limitations. On top of one condensed dataset, YOCO produces smaller condensed datasets...

10.48550/arxiv.2310.14019 preprint EN other-oa arXiv (Cornell University) 2023-01-01
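
A minimal sketch of the condense-once, shrink-on-demand workflow that YOCO targets: rank the condensed images once, then serve any smaller budget by truncation. Using a proxy model's per-image loss as the ranking score is an illustrative stand-in, not necessarily YOCO's actual rule; class balance is preserved by ranking within each class.

```python
# Rank a condensed dataset once, then derive arbitrarily smaller
# datasets without any further condensation on the device.
import torch
import torch.nn as nn

def rank_condensed_dataset(model, images, labels):
    """Return per-class index lists, lowest-loss images first (illustrative)."""
    with torch.no_grad():
        losses = nn.functional.cross_entropy(model(images), labels,
                                             reduction="none")
    ranking = {}
    for c in labels.unique().tolist():
        idx = (labels == c).nonzero(as_tuple=True)[0]
        ranking[c] = idx[losses[idx].argsort()]
    return ranking

def shrink(ranking, ipc):
    """Serve any requested images-per-class budget from the fixed ranking."""
    return torch.cat([idx[:ipc] for idx in ranking.values()])

# Toy usage: one ranking pass, then two different on-device budgets.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
images = torch.randn(100, 3, 32, 32)
labels = torch.arange(10).repeat_interleave(10)
ranking = rank_condensed_dataset(model, images, labels)
print(shrink(ranking, ipc=5).shape)  # torch.Size([50])
print(shrink(ranking, ipc=1).shape)  # torch.Size([10])
```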

In ImageNet-condensation, the storage for auxiliary soft labels exceeds that of the condensed dataset by over 30 times. However, are large-scale soft labels necessary for large-scale dataset distillation? In this paper, we first discover that the high within-class similarity in condensed datasets necessitates the use of large-scale soft labels. This can be attributed to the fact that previous methods use samples from different classes to construct a single batch for batch normalization (BN) matching. To reduce within-class similarity, we introduce class-wise supervision during the image synthesizing process by batching...

10.48550/arxiv.2410.15919 preprint EN arXiv (Cornell University) 2024-10-21
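
A minimal sketch of BN-statistics matching with class-wise batches, the change this abstract describes: each synthesis step feeds a single-class batch, so different classes are no longer pushed toward the same batch-normalization statistics. The loss form (matching each BN layer's batch mean and variance to a pretrained model's running statistics) follows common ImageNet-condensation practice; the backbone and sizes are illustrative assumptions.

```python
# BN matching on a class-wise batch: optimize synthetic images so that
# the statistics each BN layer sees match the pretrained running stats.
import torch
import torch.nn as nn

class BNMatchingHook:
    """Record the feature statistics a BN layer sees during a forward pass."""
    def __init__(self, bn: nn.BatchNorm2d):
        self.loss = torch.tensor(0.0)
        bn.register_forward_hook(self._hook)

    def _hook(self, module, inputs, output):
        x = inputs[0]
        mean = x.mean(dim=(0, 2, 3))
        var = x.var(dim=(0, 2, 3), unbiased=False)
        self.loss = ((mean - module.running_mean) ** 2).sum() + \
                    ((var - module.running_var) ** 2).sum()

model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1),
                      nn.BatchNorm2d(16), nn.ReLU())
model.eval()  # stored running statistics serve as the matching target
hooks = [BNMatchingHook(m) for m in model.modules()
         if isinstance(m, nn.BatchNorm2d)]

# Class-wise batch: every synthetic image in this step belongs to one class.
synthetic = torch.randn(8, 3, 32, 32, requires_grad=True)
opt = torch.optim.Adam([synthetic], lr=0.1)
for _ in range(10):
    opt.zero_grad()
    model(synthetic)                      # hooks capture batch statistics
    loss = sum(h.loss for h in hooks)     # sum BN-matching losses
    loss.backward()
    opt.step()
print(loss.item())
```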

10.48550/arxiv.2303.00566 preprint EN other-oa arXiv (Cornell University) 2023-01-01