Abstract

In this paper, a measure of sensitivity is defined to evaluate the fault tolerance of neural networks, and we show that the sensitivity of a link is closely related to the amount of information passing through it. Based on this observation, we prove that the output error caused by s-a-0 (stuck-at-0) faults in an MLP network follows a Gaussian distribution. The UDBP (Uniformly Distributed Back Propagation) algorithm is then introduced to minimize the mean and variance of this output error. An MLP trained with UDBP is then incorporated into an Algorithm-Based Fault Tolerant (ABFT) scheme to protect a nonlinear data processing block. A systematic real convolution code guarantees that faults corrupting the processed data produce notable nonzero values in the syndrome sequence, so a majority logic decoder can easily detect and correct single faults by observing that sequence. Simulation results demonstrating the error detection and correction behavior against random s-a-0 faults are also presented.
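To illustrate the fault model discussed in the abstract, the following minimal sketch injects random s-a-0 (stuck-at-0) faults into the links of a small two-layer MLP and collects Monte-Carlo statistics of the resulting output error, i.e. the mean and variance that UDBP aims to minimize. The architecture, activation function, fault rate, and random weights are all assumptions for illustration only; this is not the paper's UDBP training procedure or its ABFT scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical MLP dimensions; the paper does not specify an architecture.
n_in, n_hidden, n_out = 8, 16, 4

# Randomly initialized weights stand in for a trained network.
W1 = rng.normal(scale=0.5, size=(n_hidden, n_in))
W2 = rng.normal(scale=0.5, size=(n_out, n_hidden))


def mlp(x, w1, w2):
    """Two-layer MLP with a tanh hidden layer (assumed, not from the paper)."""
    return w2 @ np.tanh(w1 @ x)


def inject_sa0(w, frac, rng):
    """Force a random fraction of links (weights) to 0, modelling s-a-0 faults."""
    w_faulty = w.copy()
    mask = rng.random(w.shape) < frac
    w_faulty[mask] = 0.0
    return w_faulty


# Monte-Carlo estimate of the output-error distribution under random s-a-0 faults.
errors = []
for _ in range(2000):
    x = rng.normal(size=n_in)
    y_clean = mlp(x, W1, W2)
    y_faulty = mlp(x, inject_sa0(W1, 0.05, rng), inject_sa0(W2, 0.05, rng))
    errors.append(y_faulty - y_clean)

errors = np.concatenate(errors)
print(f"output-error mean     = {errors.mean():+.4f}")
print(f"output-error variance = {errors.var():.4f}")
```

Plotting a histogram of `errors` under these assumptions gives an empirical picture of the approximately Gaussian output-error distribution that the paper analyzes.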

  • Publication date: 2007-6