Abstract

Kernel methods are becoming increasingly popular for many real-world learning problems, yet kernel-based data analysis is frequently assumed to be restricted to positive definite kernels. In practice, however, indefinite kernels arise and are needed in pattern analysis. In this paper, we present several formal extensions of kernel discriminant analysis (KDA) methods that can be used with indefinite kernels. In particular, they include indefinite KDA based on the generalized singular value decomposition (IKDA/GSVD), pseudo-inverse IKDA, null space IKDA and range space IKDA. As with LDA-based algorithms, IKDA-based algorithms fail to account for the different contributions that individual pairs of classes make to the discrimination. To remedy this problem, weighted schemes are incorporated into the IKDA extensions, yielding what we call weighted generalized IKDA algorithms. Experiments on two real-world data sets are performed to evaluate the effectiveness of the proposed algorithms and the effect of the weights on indefinite kernel functions. The results show that the effect of the weighted schemes is very significant.
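To make the pseudo-inverse variant concrete, the following is a minimal sketch of a two-class kernel Fisher discriminant computed with an indefinite tanh (sigmoid) kernel, where the inverse of the within-class kernel scatter matrix is replaced by a Moore-Penrose pseudo-inverse. The helper names (tanh_kernel, pseudo_inverse_kda), kernel parameters, and toy data are illustrative assumptions, not the formulation used in the paper.

import numpy as np

def tanh_kernel(X, Y, gamma=0.1, c=-1.0):
    # Sigmoid (tanh) kernel; generally indefinite for these parameter choices.
    return np.tanh(gamma * X @ Y.T + c)

def pseudo_inverse_kda(K, y):
    # K: (n, n) kernel matrix (may be indefinite); y: (n,) class labels.
    # Returns expansion coefficients alpha; a sample is projected as K_row @ alpha.
    n = K.shape[0]
    m = K.mean(axis=1)                      # overall kernelized mean
    M = np.zeros((n, n))                    # between-class scatter in kernel space
    N = np.zeros((n, n))                    # within-class scatter in kernel space
    for label in np.unique(y):
        idx = np.where(y == label)[0]
        Kc = K[:, idx]
        mc = Kc.mean(axis=1)
        M += len(idx) * np.outer(mc - m, mc - m)
        center = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ center @ Kc.T
    # Pseudo-inverse copes with the singular (and possibly indefinite) within-class matrix.
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(N) @ M)
    return np.real(eigvecs[:, np.argmax(np.real(eigvals))])

# Usage on hypothetical toy data: project samples onto the discriminant direction.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(2, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
K = tanh_kernel(X, X)
alpha = pseudo_inverse_kda(K, y)
scores = K @ alpha   # one-dimensional discriminant scores for the training samples

The sketch only illustrates the general idea of replacing matrix inversion with a pseudo-inverse when the kernel is not positive definite; the GSVD, null space, range space, and weighted variants discussed in the paper are not reproduced here.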

Full text