Multi-Sample based Contrastive Loss for Top-k Recommendation

DOI: 10.48550/arXiv.2109.00217
Publication Date: 2021-01-01
ABSTRACT
The top-k recommendation is a fundamental task in recommendation systems and is generally learned by comparing positive and negative pairs. The Contrastive Loss (CL) is the key to contrastive learning, which has received increasing attention recently, and we find it well suited for top-k recommendation. However, CL treats the importance of all samples as the same. On the one hand, it faces the imbalance of one positive sample against many negative samples. On the other hand, positive items are so few in sparser datasets that their importance should be emphasized. Moreover, another important issue is that the sparse positive items are still not sufficiently utilized in recommendations. We therefore propose a new data augmentation method that uses multiple positive items (or samples) simultaneously with the CL loss function. The resulting Multi-Sample based Contrastive Loss (MSCL) function solves the two problems by balancing the importance of positive and negative samples and by data augmentation. Built on a graph convolution network (GCN) based method, experimental results demonstrate the state-of-the-art performance of MSCL. The proposed MSCL is simple and can be applied in many methods. We will release our code on GitHub upon acceptance.
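
The abstract does not give the exact MSCL formulation (its sample weighting and augmentation scheme), but the core idea, extending an InfoNCE-style contrastive loss from one positive item per user to P positives, can be sketched as follows. The function name, tensor shapes, cosine normalization, and temperature value below are illustrative assumptions, not the authors' code.

    import torch
    import torch.nn.functional as F

    def multi_sample_contrastive_loss(user_emb, pos_item_emb, neg_item_emb, temperature=0.1):
        # user_emb:     (B, d)    one embedding per user in the batch
        # pos_item_emb: (B, P, d) P positive items per user (the "multi-sample" part)
        # neg_item_emb: (B, N, d) N sampled negative items per user
        user_emb = F.normalize(user_emb, dim=-1)
        pos_item_emb = F.normalize(pos_item_emb, dim=-1)
        neg_item_emb = F.normalize(neg_item_emb, dim=-1)

        # Cosine similarities between each user and its items, scaled by temperature.
        pos_sim = torch.einsum("bd,bpd->bp", user_emb, pos_item_emb) / temperature  # (B, P)
        neg_sim = torch.einsum("bd,bnd->bn", user_emb, neg_item_emb) / temperature  # (B, N)

        # For every positive item, build logits [positive, negatives...] and apply
        # cross-entropy with the positive at index 0, i.e. InfoNCE per positive.
        B, P = pos_sim.shape
        logits = torch.cat(
            [pos_sim.unsqueeze(-1), neg_sim.unsqueeze(1).expand(-1, P, -1)], dim=-1
        )  # (B, P, 1 + N)
        targets = torch.zeros(B * P, dtype=torch.long, device=logits.device)

        # Averaging over the P positives gives every (sparse) positive item a share
        # of the gradient, instead of contrasting a single positive per user.
        return F.cross_entropy(logits.reshape(B * P, -1), targets)

    # Toy usage: 4 users, 3 positives and 20 negatives each, 64-dim embeddings.
    u, pos, neg = torch.randn(4, 64), torch.randn(4, 3, 64), torch.randn(4, 20, 64)
    print(multi_sample_contrastive_loss(u, pos, neg))

Note that this sketch only averages uniformly over the positives; the paper's actual MSCL additionally rebalances the importance of positive versus negative samples, which a uniform average does not capture.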