Abstract

Efficient computation of the eigenpairs of a matrix, especially a general real matrix, is important in engineering, and neural networks run asynchronously and can achieve high computational performance. This paper therefore introduces a recurrent neural network (RNN) to extract certain eigenpairs. The RNN, whose connection weights depend on the matrix, can be transformed into a complex differential system whose state z(t) is a complex vector. Using the analytic expression of |z(t)|², the convergence properties of the RNN are analyzed in detail. Starting from a general nonzero initial complex vector, the RNN computes the largest imaginary part among all eigenvalues; by rearranging the connection matrix, the largest real part is obtained instead. An experiment with a 7 × 7 matrix demonstrates the validity of the method. Two matrices of dimensions 50 and 100, respectively, are used to test the efficiency of the approach as the dimension grows. The results indicate that the number of iterations required for the network to reach its equilibrium state is not sensitive to the dimensionality. The RNN can also be used to estimate the largest modulus of the eigenvalues. Compared with other neural networks designed for similar purposes, this RNN is applicable to general real matrices.
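The abstract specifies neither the network's exact dynamics nor its discretization. The sketch below is a minimal illustration of the underlying principle only, under the assumption that the network realizes the linear flow dz/dt = Az integrated by a forward-Euler scheme; the function names largest_real_part and largest_imag_part are hypothetical, not the paper's. The idea: the growth rate of |z(t)| converges to the largest real part among the eigenvalues of A, and replacing the connection matrix A by −iA (one possible "rearrangement") turns the largest imaginary part of A's eigenvalues into the largest real part of the new system's eigenvalues.

```python
import numpy as np

def largest_real_part(A, dt=1e-3, steps=20000, seed=0):
    """Estimate max Re(lambda) over the eigenvalues of A by integrating
    dz/dt = A z with a forward-Euler scheme (assumed dynamics, not the
    paper's stated network). The instantaneous growth rate of |z(t)|
    converges to the largest real part of the eigenvalues."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # General nonzero initial complex vector, as in the abstract.
    z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    rate = 0.0
    for _ in range(steps):
        z_new = z + dt * (A @ z)                       # Euler step
        growth = np.linalg.norm(z_new) / np.linalg.norm(z)
        rate = np.log(growth) / dt                     # growth rate of |z|
        z = z_new / np.linalg.norm(z_new)              # avoid overflow
    return rate

def largest_imag_part(A, **kw):
    # Eigenvalues of -1j*A are Im(lambda) - 1j*Re(lambda), so the largest
    # real part of -1j*A equals the largest imaginary part of A.
    return largest_real_part(-1j * np.asarray(A, dtype=complex), **kw)

# Sanity check against a direct eigenvalue solver on a random 7 x 7 matrix.
A = np.random.default_rng(1).standard_normal((7, 7))
print(largest_imag_part(A), np.max(np.linalg.eigvals(A).imag))
```

This is only a power-iteration-style stand-in for the paper's network; the paper's own analysis via the analytic expression of |z(t)|² would replace the numerical growth-rate estimate used here.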