Dynamic Adapter Meets Prompt Tuning: Parameter-Efficient Transfer Learning for Point Cloud Analysis

Keywords: Adapter (computing), Transfer learning
DOI: 10.48550/arxiv.2403.01439 Publication Date: 2024-03-03
ABSTRACT
Point cloud analysis has achieved outstanding performance by transferring point cloud pre-trained models. However, existing methods for model adaptation usually update all model parameters, i.e., the full fine-tuning paradigm, which is inefficient as it incurs high computational costs (e.g., training GPU memory) and massive storage space. In this paper, we aim to study parameter-efficient transfer learning for point cloud analysis with an ideal trade-off between task performance and parameter efficiency. To achieve this goal, we freeze the parameters of the default pre-trained models and then propose the Dynamic Adapter, which generates a dynamic scale for each token, considering the significance of the token to the downstream task. We further seamlessly integrate the Dynamic Adapter with Prompt Tuning (DAPT) by constructing Internal Prompts, capturing instance-specific features for interaction. Extensive experiments conducted on five challenging datasets demonstrate that the proposed DAPT achieves superior performance compared to the full fine-tuning counterparts while significantly reducing the trainable parameters and training GPU memory by 95% and 35%, respectively. Code is available at https://github.com/LMD0311/DAPT.
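To make the abstract's mechanism concrete, below is a minimal, hypothetical PyTorch sketch of the two ideas it names: a bottleneck adapter whose residual update is rescaled by a per-token dynamic scale, and an internal prompt pooled from the adapted tokens and prepended for the next frozen block. All class names, dimensions, and the sigmoid scale head are illustrative assumptions, not the authors' implementation; refer to the linked repository for the actual DAPT code.

```python
import torch
import torch.nn as nn


class DynamicAdapter(nn.Module):
    """Sketch of a dynamic adapter (illustrative, not the official DAPT code):
    a bottleneck MLP whose residual update is weighted by a scale predicted
    for each token, reflecting that token's significance to the task."""

    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)   # down-projection
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck, dim)     # up-projection
        self.scale_pred = nn.Linear(dim, 1)      # per-token dynamic scale (assumed head)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (B, N, dim) output of a frozen transformer block
        delta = self.up(self.act(self.down(tokens)))
        scale = torch.sigmoid(self.scale_pred(tokens))   # (B, N, 1)
        return tokens + scale * delta                    # scaled residual update


def make_internal_prompt(adapted: torch.Tensor, proj: nn.Linear) -> torch.Tensor:
    """Pool instance-specific features into one prompt token and prepend it,
    so the next (frozen) block can attend to it. Pooling choice is assumed."""
    prompt = proj(adapted.mean(dim=1, keepdim=True))     # (B, 1, dim)
    return torch.cat([prompt, adapted], dim=1)           # (B, N + 1, dim)


if __name__ == "__main__":
    # Tiny self-contained demo with dummy point tokens; in the parameter-efficient
    # setting, only the adapter and prompt projection would be trainable while the
    # pre-trained backbone stays frozen (p.requires_grad_(False)).
    adapter, proj = DynamicAdapter(dim=384), nn.Linear(384, 384)
    x = torch.randn(2, 128, 384)
    out = make_internal_prompt(adapter(x), proj)
    print(out.shape)  # torch.Size([2, 129, 384])
```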