Wenxuan Fan

ORCID: 0000-0003-3621-2518
Research Areas
  • Advanced Neural Network Applications
  • Domain Adaptation and Few-Shot Learning
  • Machine Learning and Data Classification
  • Autonomous Vehicle Technology and Safety
  • Adversarial Robustness in Machine Learning
  • Metabolism, Diabetes, and Cancer
  • Intelligent Tutoring Systems and Adaptive Learning
  • Pharmacology and Obesity Treatment
  • Human Pose and Action Recognition
  • Diabetes Treatment and Management
  • Video Analysis and Summarization

Dalian University of Technology
2021-2025

Tongji University
2024

Knowledge distillation is the process of transferring knowledge from a large model to a small model. In this process, the small model learns the generalization ability of the large model and retains performance close to it, which provides a training means to migrate models, facilitating deployment and speeding up inference. However, previous methods require a pre-trained teacher model, which still brings computational and storage overheads. In this paper, a novel general framework called Self Distillation (SD) is proposed. We demonstrate the effectiveness of our method by...

10.48550/arxiv.2103.07350 preprint EN other-oa arXiv (Cornell University) 2021-01-01
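
For readers unfamiliar with the transfer step the abstract refers to, the following is a minimal sketch of a generic teacher-to-student distillation loss (Hinton-style soft targets plus hard labels). It is only an illustration of standard knowledge distillation, not the paper's Self Distillation (SD) framework; the function name, temperature, and alpha values are assumptions.

```python
# Generic knowledge-distillation loss sketch (NOT the paper's SD method).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Combine a soft-target KL term (teacher supervision) with hard-label CE.

    temperature and alpha are illustrative defaults, not values from the paper.
    """
    # Softened teacher distribution and student log-distribution.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)

    # KL divergence between softened distributions, scaled by T^2 as is customary.
    kd = F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature ** 2

    # Standard cross-entropy on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

The pre-trained teacher forward pass needed to produce `teacher_logits` is exactly the computational and storage overhead the abstract says self-distillation aims to remove.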

Knowledge distillation is a widely used method to transfer knowledge from a large model to a small model. Traditional methods use pre-trained teacher models to supervise the training of student models, which is called Offline Distillation. However, the structural gap between teachers and students limits its performance. After that, Online Distillation, which retrains the teacher-student network from the beginning with echo teaching, greatly improved performance. But there is very little work exploring the difference between the two. In this paper, we first point out that...

10.1088/1742-6596/1982/1/012084 article EN Journal of Physics Conference Series 2021-07-01
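
To make the Offline vs. Online contrast in the abstract concrete, here is a hedged sketch of the two training loops as they are commonly implemented: offline uses a frozen pre-trained teacher, while online trains two networks from scratch that teach each other (a mutual-learning reading of the abstract's "echo teaching"). Model, optimizer, and temperature names are placeholders, not the paper's setup.

```python
# Illustrative contrast between Offline and Online distillation (not the paper's code).
import torch
import torch.nn.functional as F

def offline_step(student, frozen_teacher, x, y, optimizer, T=4.0):
    """Offline: the teacher is pre-trained and frozen; only the student updates."""
    with torch.no_grad():
        t_logits = frozen_teacher(x)
    s_logits = student(x)
    loss = (F.kl_div(F.log_softmax(s_logits / T, dim=1),
                     F.softmax(t_logits / T, dim=1),
                     reduction="batchmean") * T ** 2
            + F.cross_entropy(s_logits, y))
    optimizer.zero_grad(); loss.backward(); optimizer.step()

def online_step(net_a, net_b, x, y, opt_a, opt_b, T=4.0):
    """Online: both networks train from the beginning and supervise each other."""
    logits_a, logits_b = net_a(x), net_b(x)

    def mutual(p_logits, q_logits):
        # Each network matches the other's (detached) softened predictions
        # in addition to fitting the ground-truth labels.
        return (F.kl_div(F.log_softmax(p_logits / T, dim=1),
                         F.softmax(q_logits.detach() / T, dim=1),
                         reduction="batchmean") * T ** 2
                + F.cross_entropy(p_logits, y))

    loss_a, loss_b = mutual(logits_a, logits_b), mutual(logits_b, logits_a)
    opt_a.zero_grad(); loss_a.backward(); opt_a.step()
    opt_b.zero_grad(); loss_b.backward(); opt_b.step()
```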