InsCLR: Improving Instance Retrieval with Self-Supervision
DOI:
10.1609/aaai.v36i1.19930
Publication Date:
2022-07-04T09:07:37Z
AUTHORS (4)
ABSTRACT
This work aims at improving instance retrieval with self-supervision. We find that fine-tuning using the recently developed self-supervised learning (SSL) methods, such as SimCLR and MoCo, fails to improve the performance of instance retrieval. In this work, we identify that the representations learnt for instance retrieval should be invariant to large variations in viewpoint, background, etc., whereas the self-augmented positives used by current SSL methods cannot provide strong enough signals for learning robust instance-level representations. To overcome this problem, we propose InsCLR, a new SSL method that builds on instance-level contrast to learn intra-class invariance by dynamically mining meaningful pseudo positive samples from both mini-batches and a memory bank during training. Extensive experiments demonstrate that InsCLR achieves similar or even better performance than the state-of-the-art SSL methods on instance retrieval. Code is available at https://github.com/zeludeng/insclr.
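The core idea in the abstract, mining pseudo positives for an instance from other samples rather than relying only on self-augmented views, can be sketched with a simple similarity-threshold heuristic. This is an illustrative assumption, not the paper's exact mining criterion: the function name `mine_pseudo_positives` and the fixed threshold are hypothetical, and InsCLR's actual selection strategy should be taken from the released code.

```python
import numpy as np

def mine_pseudo_positives(query, candidates, threshold=0.6):
    """Return indices of candidate embeddings whose cosine similarity to
    the query exceeds a threshold, treating them as pseudo positives.
    (Illustrative heuristic; the paper's mining strategy may differ.)"""
    q = query / np.linalg.norm(query)
    c = candidates / np.linalg.norm(candidates, axis=1, keepdims=True)
    sims = c @ q  # cosine similarities, since both sides are unit-normalized
    return np.where(sims > threshold)[0], sims

# Toy example: four candidate embeddings, two close to the query
# (standing in for other views of the same instance in a memory bank).
query = np.array([1.0, 0.0])
candidates = np.array([
    [0.9, 0.1],    # near-duplicate view -> likely pseudo positive
    [0.8, 0.3],    # similar view       -> likely pseudo positive
    [0.0, 1.0],    # different instance
    [-1.0, 0.0],   # different instance
])
pos_idx, sims = mine_pseudo_positives(query, candidates)
print(pos_idx)  # -> [0 1]
```

The mined indices would then be treated as additional positives in the contrastive loss, supplying the viewpoint/background invariance signal that self-augmentation alone cannot.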
CITATIONS (8)