Are Natural Domain Foundation Models Useful for Medical Image Classification?
DOI:
10.48550/arXiv.2310.19522
Publication Date:
2023-10
AUTHORS (6)
ABSTRACT
The deep learning field is converging towards the use of general foundation models that can be easily adapted for diverse tasks. While this paradigm shift has become common practice within natural language processing, progress has been slower in computer vision. In this paper we attempt to address this issue by investigating the transferability of various state-of-the-art foundation models to medical image classification tasks. Specifically, we evaluate the performance of five foundation models, namely SAM, SEEM, DINOv2, BLIP, and OpenCLIP, across four well-established medical imaging datasets. We explore different training settings to fully harness the potential of these models. Our study shows mixed results. DINOv2 consistently outperforms the standard ImageNet pretraining. However, the other models failed to consistently beat this established baseline, indicating limitations in their transferability to medical image classification.
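The abstract does not spell out the evaluation setup, but one of the training settings it alludes to, adapting a frozen foundation model with a lightweight classification head (linear probing), can be sketched as follows. This is an illustrative sketch only, not the authors' code: the DINOv2 torch.hub entry point is the publicly documented one, while the dataset (a FakeData stand-in), head size, and hyperparameters are assumptions to be replaced with a real medical imaging dataset and tuned values.

```python
# Minimal linear-probing sketch: frozen DINOv2 backbone + linear head.
# FakeData is a stand-in; substitute a medical imaging dataset
# (e.g. an ImageFolder over chest X-rays) for a comparable setup.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# Small DINOv2 backbone from torch.hub (ViT-S/14, 384-dim embeddings);
# downloading the weights requires network access.
backbone = torch.hub.load("facebookresearch/dinov2", "dinov2_vits14").to(device)
backbone.eval()
for p in backbone.parameters():          # freeze the backbone: linear probing only
    p.requires_grad = False

num_classes = 2                          # assumed binary classification task
head = nn.Linear(384, num_classes).to(device)

# 224x224 inputs (a multiple of the 14-pixel patch size).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
train_set = datasets.FakeData(size=64, image_size=(3, 224, 224),
                              num_classes=num_classes, transform=preprocess)
loader = DataLoader(train_set, batch_size=16, shuffle=True)

optimizer = torch.optim.AdamW(head.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(3):
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        with torch.no_grad():            # frozen backbone produces features
            feats = backbone(images)     # (batch, 384) class-token embeddings
        logits = head(feats)
        loss = criterion(logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```

Full fine-tuning of the backbone (unfreezing its parameters and lowering the learning rate) is the other common setting such comparisons explore; the sketch above only covers the frozen-feature case.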