Abstract

In recent years, neuromorphic hardware systems have grown significantly in size. With more and more neurons and synapses integrated into such systems, neural connectivity and its configurability have become crucial design constraints. To tackle this problem, we introduce a generic extended graph description of connection topologies that allows a systematic analysis of connectivity in both neuromorphic hardware and neural network models. The unifying nature of our approach enables a close exchange between hardware and models. For an existing hardware system, the optimally matched network model can be extracted. Conversely, a hardware architecture may be fitted to a particular model network topology with our description method. As a further strength, the extended graph can be used to quantify the amount of configurability for a certain network topology. This is a hardware design variable that has been widely neglected, mainly because a suitable analysis method has been missing. To condense our analysis results, we develop a classification for the scaling complexity of network models and neuromorphic hardware, based on the total number of connections and the configurability. We find a gap between several models and existing hardware, making these hardware systems either impossible or inefficient to use for scaled-up network models. In this respect, our analysis results suggest models with locality in their connections as a promising approach for tackling this scaling gap.
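To make the abstract's notions of "total number of connections" and "configurability" concrete, the following minimal sketch (not the paper's actual extended-graph formalism; all topologies, parameters, and the fan-in limit are illustrative assumptions) counts the synapses of two toy network topologies and estimates what fraction of them a hypothetical hardware with a fixed per-neuron fan-in could realize:

```python
# Illustrative sketch only: represent a network topology as a set of directed
# synapse edges, count total connections, and compare against an assumed
# hardware fan-in limit as a crude proxy for configurability demands.

from itertools import product
import random


def local_topology(n, radius):
    """Toy model with local connectivity: neuron i connects to neighbours within `radius`."""
    return {(i, j) for i, j in product(range(n), repeat=2)
            if i != j and abs(i - j) <= radius}


def random_topology(n, p, seed=0):
    """Toy model with uniform random connectivity of probability p."""
    rng = random.Random(seed)
    return {(i, j) for i, j in product(range(n), repeat=2)
            if i != j and rng.random() < p}


def realizable_fraction(edges, max_fan_in):
    """Fraction of model synapses mappable under a per-neuron fan-in limit."""
    fan_in = {}
    realizable = 0
    for i, j in sorted(edges):
        if fan_in.get(j, 0) < max_fan_in:
            fan_in[j] = fan_in.get(j, 0) + 1
            realizable += 1
    return realizable / len(edges)


if __name__ == "__main__":
    n = 200
    for name, edges in [("local", local_topology(n, radius=5)),
                        ("random", random_topology(n, p=0.1))]:
        print(f"{name}: {len(edges)} synapses, "
              f"realizable under fan-in 16: {realizable_fraction(edges, 16):.2f}")
```

Running this shows the qualitative point made in the abstract: a locally connected topology keeps its total connection count and per-neuron fan-in bounded, whereas a dense random topology quickly exceeds the assumed hardware limit as the network is scaled up.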

  • Publication date: 2011-06