Abstract

We propose adaptive constraint propagation (ACP) for semi-supervised kernel matrix learning (SS-KML). SS-KML aims to learn a kernel matrix from given samples that contain only a small amount of supervised information, such as class labels or pairwise constraints. Recently, constraint propagation methods based on semi-definite programming have been actively studied for effective SS-KML; the representative works are pairwise constraint propagation (PCP) and kernel propagation (KP). Both use hard constraints in their propagation frameworks and achieve outstanding classification performance. However, a small set of hard constraints sometimes cannot cover all the samples in the data set, which leads to large distortions of the global discriminative structure in the learned kernel matrix and degrades classification performance. To deal with this problem, we introduce two adaptive fidelity terms that encourage must-link samples to become close and cannot-link samples to become far apart. We then build a new framework based on these terms to adaptively propagate the constraints. Experimental results demonstrate that ACP outperforms state-of-the-art SS-KML methods such as PCP and KP in terms of both effectiveness and efficiency.
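To make the contrast with hard constraints concrete, the following is a minimal illustrative sketch of what soft fidelity terms for pairwise constraints on a kernel matrix can look like; it is not the paper's actual ACP formulation, and the constraint sets $\mathcal{M}$, $\mathcal{C}$, the weights $w_{ij}$, and the margin $\tau$ are assumptions introduced here for exposition.

```latex
% Illustrative sketch only (assumed notation, not the ACP objective).
% K is the kernel matrix to be learned, M the must-link pairs, C the cannot-link pairs,
% and d_K(i,j) = K_{ii} + K_{jj} - 2 K_{ij} the kernel-induced squared distance.
\begin{align}
  F_{\mathrm{ML}}(K) &= \sum_{(i,j)\in\mathcal{M}} w_{ij}\, d_K(i,j), \\
  F_{\mathrm{CL}}(K) &= \sum_{(i,j)\in\mathcal{C}} w_{ij}\, \max\!\bigl(0,\; \tau - d_K(i,j)\bigr).
\end{align}
```

Minimizing such penalty terms over positive semi-definite $K$ pulls must-link pairs together and pushes cannot-link pairs apart, whereas a hard-constraint formulation would instead enforce $d_K(i,j)=0$ for must-link pairs and $d_K(i,j)\ge\tau$ for cannot-link pairs exactly.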