Abstract

This paper focuses on the powerful concept of modularity. It is described how this concept is deployed in natural neural networks at both the architectural and the functional level. Furthermore, different approaches to modular neural networks are discussed. Building on these approaches, a two-layer modular neural system is introduced. The basic building blocks of the architecture are multilayer perceptrons (MLP) trained with the backpropagation (BP) algorithm. This modular network is designed to combine two different approaches to generalization known from connectionist and logical neural networks, thereby enhancing the generalization abilities of the network. Experiments described in this paper show that the architecture is especially useful for solving problems with a large number of input attributes.
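To make the notion of MLP building blocks concrete, the sketch below shows a minimal two-layer modular arrangement: two small multilayer perceptrons, each responsible for half of the input attributes, feed a third perceptron that combines their outputs, and all three are trained with plain backpropagation. The module sizes, the attribute split, the toy comparison task, and all names are illustrative assumptions for this sketch, not the architecture proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MLPModule:
    """One-hidden-layer perceptron trained by backpropagation (illustrative building block)."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.5):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, x):
        self.x = x
        self.h = sigmoid(x @ self.W1 + self.b1)       # hidden activations
        self.y = sigmoid(self.h @ self.W2 + self.b2)  # output activations
        return self.y

    def backward(self, d_y):
        # d_y is dL/dy at this module's output; returns dL/dx for the layer below.
        d_out = d_y * self.y * (1.0 - self.y)
        d_hid = (d_out @ self.W2.T) * self.h * (1.0 - self.h)
        d_x = d_hid @ self.W1.T
        self.W2 -= self.lr * np.outer(self.h, d_out)
        self.b2 -= self.lr * d_out
        self.W1 -= self.lr * np.outer(self.x, d_hid)
        self.b1 -= self.lr * d_hid
        return d_x

# Two first-layer modules, each seeing half of the input attributes,
# plus a second-layer module that combines their outputs (an assumed split).
mod_a = MLPModule(n_in=4, n_hidden=6, n_out=2)
mod_b = MLPModule(n_in=4, n_hidden=6, n_out=2)
combiner = MLPModule(n_in=4, n_hidden=4, n_out=1)

# Toy task: decide whether the first half of a binary input vector
# contains more ones than the second half.
X = rng.integers(0, 2, (200, 8)).astype(float)
T = (X[:, :4].sum(axis=1) > X[:, 4:].sum(axis=1)).astype(float).reshape(-1, 1)

def predict(x):
    z = np.concatenate([mod_a.forward(x[:4]), mod_b.forward(x[4:])])
    return combiner.forward(z)

for epoch in range(2000):
    for x, t in zip(X, T):
        out = predict(x)
        d_z = combiner.backward(out - t)   # squared-error gradient at the top module
        mod_a.backward(d_z[:2])            # propagate the error into each first-layer module
        mod_b.backward(d_z[2:])

preds = np.array([predict(x) for x in X])
print("training accuracy:", float(((preds > 0.5) == T).mean()))
```

In this sketch, partitioning the input attributes across modules keeps each building block small even when the total number of attributes grows, which is one common motivation for modular designs on problems with many input attributes; the specific combination scheme used in the paper may differ.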