LoOp: Looking for Optimal Hard Negative Embeddings for Deep Metric Learning
DOI: 10.48550/arxiv.2108.09335
Publication Date: 2021-01-01
AUTHORS (5)
ABSTRACT
Deep metric learning has been effectively used to learn distance metrics for different visual tasks like image retrieval, clustering, etc. In order to aid the training process, existing methods either use a hard mining strategy to extract the most informative samples or seek to generate synthetic samples using an additional network. Such approaches face different challenges and can lead to biased embeddings in the former case, and to (i) harder optimization, (ii) slower training speed, and (iii) higher model complexity in the latter case. To overcome these challenges, we propose a novel approach that looks for optimal hard negatives (LoOp) in the embedding space, taking full advantage of each tuple by calculating the minimum distance between a pair of positives and a pair of negatives. Unlike mining-based methods, our approach considers the entire space of embedding pairs when calculating the optimal hard negatives. Extensive experiments combining our approach with representative metric learning losses reveal a significant boost in performance on three benchmark datasets.
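The abstract only sketches the core computation, so here is a minimal, illustrative sketch of the idea as stated: for one tuple, find the minimum distance between the segment connecting a positive pair of embeddings and the segment connecting a negative pair. This is not the authors' implementation; the function name, the NumPy grid-search approximation, and the toy data are assumptions for illustration, and the paper presumably obtains this minimum analytically rather than by search.

```python
import numpy as np

def min_segment_distance(p1, p2, n1, n2, num_steps=200):
    """Approximate the minimum distance between the segment joining a
    positive pair (p1, p2) and the segment joining a negative pair (n1, n2)
    in the embedding space, via a dense grid search over the interpolation
    coefficients. Returns the distance and the two synthetic embeddings
    that attain it. (Hypothetical helper, not the paper's code.)"""
    s = np.linspace(0.0, 1.0, num_steps)[:, None]            # (S, 1) coefficients on the positive segment
    t = np.linspace(0.0, 1.0, num_steps)[:, None]            # (T, 1) coefficients on the negative segment

    pos_points = p1[None, :] + s * (p2 - p1)[None, :]         # (S, D) candidate synthetic positives
    neg_points = n1[None, :] + t * (n2 - n1)[None, :]         # (T, D) candidate synthetic hard negatives

    # Pairwise distances between every candidate positive and negative point.
    diffs = pos_points[:, None, :] - neg_points[None, :, :]   # (S, T, D)
    dists = np.linalg.norm(diffs, axis=-1)                    # (S, T)

    i, j = np.unravel_index(np.argmin(dists), dists.shape)
    return dists[i, j], pos_points[i], neg_points[j]


# Toy usage: two embeddings of the same class and two of a different class.
rng = np.random.default_rng(0)
p1, p2 = rng.normal(size=8), rng.normal(size=8)
n1, n2 = rng.normal(size=8), rng.normal(size=8)
d, hard_pos, hard_neg = min_segment_distance(p1, p2, n1, n2)
print(f"approximate minimum positive-negative distance: {d:.4f}")
```

In a practical setting one would replace the grid search with a closed-form solution over the two interpolation coefficients and plug the resulting minimum distance into a standard metric learning loss; the sketch above is only meant to make the geometric idea in the abstract concrete.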