Test-Time Model Adaptation with Only Forward Passes
FOS: Computer and information sciences
Computer Science - Machine Learning
Machine Learning (cs.LG)
DOI:
10.48550/arxiv.2404.01650
Publication Date:
2024-04-02
AUTHORS (5)
ABSTRACT
Test-time adaptation has proven effective in adapting a given trained model to unseen test samples with potential distribution shifts. However, in real-world scenarios, models are usually deployed on resource-limited devices, e.g., FPGAs, and are often quantized and hard-coded with non-modifiable parameters for acceleration. In light of this, existing methods are often infeasible, since they heavily depend on computation-intensive backpropagation for model updating, which may not be supported. To address this, we propose a test-time Forward-Only Adaptation (FOA) method. In FOA, we seek to solely learn a newly added prompt (as the model's input) via a derivative-free covariance matrix adaptation evolution strategy. To make this strategy work stably under our online unsupervised setting, we devise a novel fitness function by measuring the test-training statistic discrepancy and the model prediction entropy. Moreover, we design an activation shifting scheme that directly tunes the activations of shifted test samples, making them align with the source training domain and thereby further enhancing adaptation performance. Without using any backpropagation or altering model weights, FOA runs on a quantized 8-bit ViT and outperforms gradient-based TENT on a full-precision 32-bit ViT, while achieving an up to 24-fold memory reduction on ImageNet-C. The code will be released.
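Below is a minimal, self-contained sketch (Python, using NumPy and the third-party `cma` package) of the forward-only adaptation idea the abstract describes: a prompt is optimized with a derivative-free covariance matrix adaptation evolution strategy, guided by an unsupervised fitness that combines a test-vs-source feature-statistic discrepancy with prediction entropy, followed by a simple activation-shifting step. All concrete names and numbers here (`vit_forward`, `source_cls_mean`, `test_stream`, the toy dimensions and the equal term weighting) are illustrative assumptions, not the authors' actual implementation or API.

```python
# Sketch of forward-only test-time adaptation with CMA-ES over an input prompt.
# The "model" is a toy stand-in: only its forward outputs are used, never gradients.
import numpy as np
import cma  # pip install cma

rng = np.random.default_rng(0)

PROMPT_DIM = 64  # kept small for the toy; a real ViT prompt would be wider (e.g., 768)
source_cls_mean = np.zeros(PROMPT_DIM)  # assumed pre-collected source-domain CLS statistics
source_cls_std = np.ones(PROMPT_DIM)
W_cls = rng.standard_normal((PROMPT_DIM, 1000)) * 0.02  # toy classifier head


def vit_forward(x_batch, prompt):
    """Toy stand-in for the frozen (possibly quantized) ViT forward pass.

    Returns (logits, cls_features); the real deployed model would be a black box.
    """
    cls_feat = x_batch.mean(axis=1) + prompt  # pretend CLS features, prompt-conditioned
    logits = cls_feat @ W_cls
    return logits, cls_feat


def fitness(prompt, x_batch):
    """Unsupervised fitness: source/test statistic discrepancy + prediction entropy."""
    logits, cls_feat = vit_forward(x_batch, prompt)

    # (1) discrepancy between test-batch CLS statistics and stored source statistics
    disc = (np.abs(cls_feat.mean(0) - source_cls_mean).sum()
            + np.abs(cls_feat.std(0) - source_cls_std).sum())

    # (2) mean prediction entropy as a confidence surrogate
    p = np.exp(logits - logits.max(1, keepdims=True))
    p /= p.sum(1, keepdims=True)
    ent = -(p * np.log(p + 1e-8)).sum(1).mean()

    return disc + ent  # equal weighting for simplicity; CMA-ES minimizes this value


def shift_activations(cls_feat):
    """Illustrative activation shifting: move test features toward source statistics."""
    return cls_feat - cls_feat.mean(0) + source_cls_mean


def test_stream(num_batches=3, batch_size=8, tokens=4):
    """Toy test stream standing in for corrupted (distribution-shifted) batches."""
    for _ in range(num_batches):
        yield rng.standard_normal((batch_size, tokens, PROMPT_DIM)) + 1.0


# Online, backpropagation-free adaptation loop over incoming test batches.
es = cma.CMAEvolutionStrategy(np.zeros(PROMPT_DIM), 0.1)
for x_batch in test_stream():
    candidates = es.ask()                                   # sample candidate prompts
    es.tell(candidates, [fitness(c, x_batch) for c in candidates])
    best_prompt = es.result.xbest                           # best prompt found so far

    logits, cls_feat = vit_forward(x_batch, best_prompt)
    shifted = shift_activations(cls_feat)                   # would feed the classifier head
    print("batch fitness:", round(es.result.fbest, 3))
```

In a real deployment the forward passes would go through the quantized ViT and the evolved prompt would be carried across batches; this toy only illustrates the ask/evaluate/tell loop, the entropy-plus-statistics fitness design, and the idea of shifting activations toward source statistics.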