Abstract

When all training instances and labels are considered together in a single optimization problem, multi-label support and core vector machines (i.e., Rank-SVM and Rank-CVM) are formulated as quadratic programming (QP) problems with equality and bounded constraints, whose training procedures have a sub-linear convergence rate. It is therefore highly desirable to design and implement a novel, efficient SVM-type multi-label algorithm. In this paper, by applying pairwise constraints between relevant and irrelevant labels and defining an approximate ranking loss, we generalize the binary Lagrangian support vector machine (LSVM) to its multi-label form (Rank-LSVM), resulting in a strictly convex QP problem with non-negative constraints only. In particular, each training instance is associated with a block of variables, so all variables are naturally divided into manageable blocks. Consequently, we build an efficient training procedure for Rank-LSVM using the random block coordinate descent method, which has a linear convergence rate. Moreover, a heuristic strategy is applied to reduce the number of support vectors. Experimental results on twelve data sets demonstrate that our method performs better on five performance measures, and runs on average 15 and 107 times faster with 9% and 15% fewer support vectors than Rank-CVM and Rank-SVM, respectively.
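To make the optimization scheme concrete, the following is a minimal illustrative sketch of random block coordinate descent applied to a strictly convex QP with non-negative constraints only, i.e., min_{α ≥ 0} (1/2)αᵀQα − cᵀα. It is not the authors' implementation: the block partition, projected-gradient block update, step sizes, and iteration budget are all assumptions made for illustration; Rank-LSVM's actual per-instance subproblem structure is richer.

```python
# Sketch (assumed setup): random block coordinate descent with projection
# onto the non-negative orthant, for min_{alpha >= 0} 0.5*a'Qa - c'a.
import numpy as np

def random_block_coordinate_descent(Q, c, block_size, n_iters=1000, seed=0):
    """Projected-gradient block updates; blocks picked uniformly at random."""
    rng = np.random.default_rng(seed)
    n = c.shape[0]
    alpha = np.zeros(n)
    # Partition variables into contiguous blocks (in Rank-LSVM each block
    # corresponds to one training instance; here the partition is a stand-in).
    blocks = [np.arange(s, min(s + block_size, n))
              for s in range(0, n, block_size)]
    # Per-block Lipschitz constants: largest eigenvalue of each diagonal
    # sub-block, used as the inverse step size for that block.
    L = [np.linalg.eigvalsh(Q[np.ix_(B, B)]).max() for B in blocks]
    for _ in range(n_iters):
        k = rng.integers(len(blocks))
        B = blocks[k]
        grad_B = Q[B, :] @ alpha - c[B]                       # block gradient
        alpha[B] = np.maximum(alpha[B] - grad_B / L[k], 0.0)  # step + project
    return alpha
```

Because the objective is strictly convex and each update touches only one block, every iteration is cheap and, as the abstract notes, the overall scheme converges linearly, in contrast with the sub-linear rate of the full QP solvers used by Rank-SVM and Rank-CVM.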