Abstract

Linear discriminant analysis (LDA) is a traditional statistical technique that reduces dimensionality while preserving as much of the class discriminatory information as possible. The conventional form of LDA assumes that all the data are available in advance, and the LDA feature space is computed by finding the eigendecomposition of an appropriate matrix. However, there are situations where the data are presented in a sequence and the LDA features must be updated incrementally as new samples arrive. Chatterjee and Roychowdhury proposed an algorithm for incrementally computing the LDA features, and Moghaddam et al. subsequently accelerated its convergence rate. The algorithms proposed by Moghaddam et al. are derived by applying the chain rule to an implicit cost function. Since the authors did not have access to an explicit cost function, they could not analyze the convergence of their algorithms, and the convergence of the proposed accelerated techniques was not guaranteed. In this paper, we briefly review the previously proposed algorithms and then derive new algorithms to accelerate the convergence rate of the incremental LDA algorithm given by Chatterjee and Roychowdhury. The proposed algorithms are derived by optimizing the step size in each iteration using steepest descent and conjugate direction methods. We test the performance of the proposed algorithms for incremental LDA on synthetic and real data sets. The simulation results confirm that the proposed algorithms estimate the LDA features faster than the gradient-descent-based algorithm presented by Moghaddam et al. and the algorithm proposed by Chatterjee and Roychowdhury.
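For context on the batch setting the abstract contrasts against, the sketch below shows conventional LDA computed via eigendecomposition of the matrix S_W^{-1} S_B built from the within-class and between-class scatter matrices. This is the standard textbook formulation, not the paper's incremental algorithm; the function name, parameters, and use of a pseudo-inverse for numerical robustness are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def batch_lda(X, y, n_components):
    """Conventional (batch) LDA: project X onto the leading
    eigenvectors of pinv(S_W) @ S_B. Illustrative sketch only."""
    classes = np.unique(y)
    mean_total = X.mean(axis=0)
    d = X.shape[1]
    S_W = np.zeros((d, d))  # within-class scatter
    S_B = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        S_W += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_total).reshape(-1, 1)
        S_B += Xc.shape[0] * (diff @ diff.T)
    # Eigendecomposition of pinv(S_W) @ S_B; pinv guards against a
    # singular within-class scatter matrix.
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_W) @ S_B)
    order = np.argsort(eigvals.real)[::-1]
    W = eigvecs[:, order[:n_components]].real
    return X @ W
```

Because this construction requires all samples at once to form the scatter matrices, it cannot be applied directly when data arrive sequentially, which is the gap the incremental algorithms discussed above address.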

  • Publication date: June 2015