Self-supervised Mean Teacher for Semi-supervised Chest X-ray Classification

DOI: 10.48550/arxiv.2103.03629 Publication Date: 2021-01-01
ABSTRACT
The training of deep learning models generally requires a large amount of annotated data for effective convergence and generalisation. However, obtaining high-quality annotations is a laboursome and expensive process due to the need for expert radiologists in the labelling task. The study of semi-supervised learning in medical image analysis is therefore of crucial importance, given that it is much less expensive to obtain unlabelled images than to acquire images labelled by radiologists. Essentially, semi-supervised methods leverage large unlabelled sets to enable better generalisation than using only a small set of labelled images. In this paper, we propose Self-supervised Mean Teacher for Semi-supervised (S$^2$MTS$^2$) learning, which combines self-supervised mean-teacher pre-training with semi-supervised fine-tuning. The main innovation of S$^2$MTS$^2$ is the self-supervised mean-teacher pre-training based on joint contrastive learning, which uses an infinite number of pairs of positive query and key features to improve the mean-teacher representation. The model is then fine-tuned using the exponential moving average teacher framework trained with semi-supervised learning. We validate it on the multi-label classification problems from Chest X-ray14 and CheXpert, and the multi-class classification problem from ISIC2018, where we show that it outperforms the previous SOTA by a large margin.
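As context for the abstract above, the exponential moving average (EMA) teacher update used by mean-teacher frameworks can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; the function name, parameter lists, and the smoothing coefficient value are assumptions.

```python
# Sketch of a mean-teacher EMA update: each teacher parameter is moved
# towards its student counterpart by a smoothing factor alpha.
# (Illustrative only; names and alpha=0.999 default are assumptions.)

def ema_update(teacher_params, student_params, alpha=0.999):
    """Return the EMA-updated teacher parameters.

    teacher_params, student_params: sequences of floats (stand-ins for
    the corresponding network weights).
    """
    return [alpha * t + (1.0 - alpha) * s
            for t, s in zip(teacher_params, student_params)]

# Example: teacher drifts slowly towards the student.
teacher = [0.0, 2.0]
student = [1.0, 0.0]
teacher = ema_update(teacher, student, alpha=0.9)
```

In practice the student is trained by gradient descent on labelled and pseudo-labelled data, while the teacher is updated only through this EMA rule, which is what stabilises its predictions.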