Approximately optimizing NDCG using pair-wise loss

Authors: Jin, Xiao-Bo; Geng, Guang-Gang; Xie, Guo-Sen; Huang, Kaizhu*
Source: Information Sciences, 2018, 453: 50-65.
DOI: 10.1016/j.ins.2018.04.033

Abstract

The Normalized Discounted Cumulative Gain (NDCG) is used to measure the performance of ranking algorithms. Much of the work on learning to rank that optimizes NDCG directly or indirectly is based on list-wise approaches. In our work, we approximately optimize a variant of NDCG called NDCG(beta), which uses a linear discounting function, through pair-wise approaches. We first prove that the DCG error of NDCG(beta) is equal to a weighted pair-wise loss; on that basis, RankBoost(ndcg) and RankSVM(ndcg) are proposed to optimize an upper bound of the pair-wise 0-1 loss function. Experimental results from applying our approaches and ten other state-of-the-art methods to five public datasets show the superiority of the proposed methods, especially RankSVM(ndcg). In addition, RankBoost(ndcg) is less influenced by the initial weight distribution.
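To make the metric concrete, below is a minimal Python sketch of an NDCG computation with a linear discounting function. The specific discount form max(0, 1 - beta * (rank - 1) / n) and the function names are illustrative assumptions, not the paper's exact NDCG(beta) definition.

```python
import numpy as np

def dcg(relevances, discounts):
    """Discounted cumulative gain: exponential gains weighted by position discounts."""
    gains = 2.0 ** np.asarray(relevances, dtype=float) - 1.0
    return float(np.sum(gains * discounts))

def ndcg_linear(relevances, beta=1.0):
    """NDCG with an assumed linear discount: max(0, 1 - beta * (rank - 1) / n).

    `relevances` are the graded relevance labels in the order produced by the
    ranker. The result is normalized by the DCG of the ideal (sorted) ordering.
    """
    rel = np.asarray(relevances, dtype=float)
    n = len(rel)
    ranks = np.arange(1, n + 1)
    discounts = np.maximum(0.0, 1.0 - beta * (ranks - 1) / n)
    ideal = np.sort(rel)[::-1]           # best achievable ordering of the labels
    idcg = dcg(ideal, discounts)
    return dcg(rel, discounts) / idcg if idcg > 0 else 0.0

# Example: relevance labels listed in predicted rank order
print(ndcg_linear([3, 2, 3, 0, 1, 2], beta=1.0))
```

A linear discount of this kind decays steadily with rank (rather than logarithmically), which is what makes the DCG error decomposable into weighted pair-wise terms in the paper's analysis.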