An Analysis of the Softmax Cross Entropy Loss for Learning-to-Rank with Binary Relevance

Keywords: Learning to Rank, Softmax function, Ranking SVM, Relevance, Rank (graph theory), Ground truth
DOI: 10.1145/3341981.3344221 Publication Date: 2019-09-27T12:34:07Z
ABSTRACT
One of the challenges of learning-to-rank for information retrieval is that ranking metrics are not smooth and as such cannot be optimized directly with gradient descent optimization methods. This gap has given rise to a large body of research that reformulates the problem to fit into existing machine learning frameworks or defines a surrogate, ranking-appropriate loss function. One such loss is ListNet's, which measures the cross entropy between two distributions over documents: one obtained from the model's scores and another from the ground-truth labels. This loss was designed to capture permutation probabilities and is therefore considered to be only loosely related to ranking metrics. In this work, however, we show that the above statement is not entirely accurate. In fact, we establish an analytical connection between the two in a learning-to-rank setup with binary relevance; in particular, we show that the loss bounds Mean Reciprocal Rank and Normalized Discounted Cumulative Gain. Our analysis sheds light on the behavior of the loss and explains its superior performance on binary labeled data over data with graded relevance.
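The loss discussed in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's code; the function names are ours, and normalizing the binary labels into a target distribution is one common formulation of the softmax cross entropy (ListNet-style) loss for a single query:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax: shift by the max before exponentiating.
    z = x - np.max(x)
    e = np.exp(z)
    return e / e.sum()

def softmax_cross_entropy_loss(scores, labels):
    """Softmax cross entropy over one query's documents.

    `labels` are binary relevance judgments; they are normalized to a
    target distribution p, and the loss is the cross entropy between p
    and the softmax q of the model scores (an illustrative sketch, not
    the authors' implementation).
    """
    y = np.asarray(labels, dtype=float)
    p = y / y.sum()                       # target distribution from labels
    q = softmax(np.asarray(scores, dtype=float))  # model distribution
    return float(-np.sum(p * np.log(q)))
```

For example, with labels `[1, 0, 0]`, scoring the relevant document highest (`[3, 0, 0]`) yields a smaller loss than scoring an irrelevant one highest (`[0, 3, 0]`), consistent with the loss acting as a surrogate for rank-sensitive metrics.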